Hacker News

> How are dark environments a big problem if you can see the buttons?

You can't see your hands.

I know where my hands are. They're attached to me. Getting my hands to a glowing button isn't a problem.

Or (earnest question) is this a problem which non-kinesthetic thinkers actually have?


I've never had a problem figuring out where my hands are in the dark. It's always figuring out where the damn "B" key is on my keyboard.


Light from the screen should take care of that.


If your model requires this many compromises, it's hard to say it's as good as one that doesn't.


I fail to see how having the screen of your device turned on while you use it is "a compromise".


A brightly lit screen and hand-eye coordination are not 'compromises' unless your goal is to build a device that is all things to everyone. There are non-tablet devices with highly tactile physical input, and there are also these mysterious inventions called 'light bulbs'.


Thing A requires some things to do X. Thing B doesn't require these things to do X. One is a superset of the other, therefore one is clearly better, no matter what the semantics.


Thing A requires gasoline and a starter motor to get you to places. Thing B doesn't require gasoline or a starter motor to get you to places. One is a superset of the other, therefore a bicycle is better than a car no matter what the semantics.

Are you serious?

The point is that tablet computers get used in different scenarios from desktop PCs and in different scenarios from physical mixing boards. Pick a device designed for your scenario, then complain if it's not fit for purpose. You're not going to have a mixing board on the train with you and you're not going to use a tablet to do a live mix at a dimly lit concert.
