
This used to be somewhat true, but modern digital audio often uses such high internal rates that converting between the two is near-lossless. For instance, the least common multiple of 44.1kHz and 48kHz is in the 7MHz region, and real-time conversion between them is quite doable today. We'd already been doing it with decimation on much simpler hardware for a few decades before that.
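
For a rough illustration of that rational-rate conversion (a Python/scipy sketch, not anyone's actual implementation; the 1 kHz test tone is made up): lcm(44100, 48000) = 7,056,000 Hz, i.e. the "7 MHz region", so the conversion is upsample-by-160, filter, decimate-by-147, which a polyphase resampler does in one pass.

    # Sketch: 44.1 kHz -> 48 kHz rational resampling via a polyphase filter.
    import numpy as np
    from scipy.signal import resample_poly
    from math import lcm, gcd

    src, dst = 44100, 48000
    print(lcm(src, dst))                        # 7056000 -- the ~7 MHz intermediate rate
    up, down = dst // gcd(src, dst), src // gcd(src, dst)   # 160, 147

    t = np.arange(src) / src                    # 1 second of a 1 kHz sine at 44.1 kHz
    x = np.sin(2 * np.pi * 1000 * t)
    y = resample_poly(x, up, down)              # interpolate, filter, decimate in one pass
    print(len(x), len(y))                       # 44100 48000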



That might be true for the hardware, but since mixing of multiple streams is a software function nowadays (… some older HW was multi-stream capable…), any 44.1kHz stream has a good chance of being resampled to 48kHz just to allow mixing it with other sounds.

Even if it's the only stream and you could switch the codec to 44.1kHz mode, what do you do if the OS wants to play a random notification sound? Switching between 44.1kHz and 48kHz is not going to be hitless on a significant amount of HW (not all, but most, I'd guess), so whoever's writing your OS mixer code would reasonably make the call to always mix at 48kHz…

(Yes this argument primarily applies to PCs and phones, hopefully on a HiFi system that just happens to use COTS embedded devices they'd write some code to switch the rate…)
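
To make the mixer argument concrete, a toy sketch (Python/scipy, all names, rates, and stream contents made up, not any real OS mixer): pick one fixed mix rate, resample every incoming stream to it, and sum, so a 44.1kHz music stream and a 48kHz notification never force a codec rate switch.

    # Sketch: a software mixer with one fixed output rate (48 kHz).
    import numpy as np
    from scipy.signal import resample_poly
    from math import gcd

    MIX_RATE = 48000

    def to_mix_rate(samples, src_rate):
        # Resample one stream to the mixer's fixed output rate.
        if src_rate == MIX_RATE:
            return samples
        g = gcd(MIX_RATE, src_rate)
        return resample_poly(samples, MIX_RATE // g, src_rate // g)

    def mix(streams):
        # streams: list of (samples, sample_rate) pairs; resample each, then sum.
        resampled = [to_mix_rate(s, r) for s, r in streams]
        out = np.zeros(max(len(s) for s in resampled))
        for s in resampled:
            out[:len(s)] += s
        return np.clip(out, -1.0, 1.0)          # naive clipping in place of a limiter

    # e.g. 0.5 s of music at 44.1 kHz plus 0.2 s of a notification at 48 kHz
    music = 0.5 * np.sin(2 * np.pi * 440 * np.arange(22050) / 44100)
    ding  = 0.3 * np.sin(2 * np.pi * 880 * np.arange(9600) / 48000)
    print(len(mix([(music, 44100), (ding, 48000)])))   # 24000 samples at 48 kHz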


Well, my OS lets me configure the output device and specify the sampling rate. I would think that if I configure it at 44.1 kHz, I'm playing back a ripped CD, and the system tries to play a notification sound, then whatever rate that sound is sampled at, it'll be output at 44.1 kHz. Otherwise, what's the point of the setting?



