I'm the cofounder at Lumina - we're building a modern webcam designed to solve some of these problems.
There's really been a lack of innovation in the entire home office space, with the webcam being particularly bad. It sucks that a decade-old product (Logitech C920) is still the bestselling product today -- that would be like if Apple stopped releasing new phones after the iPhone 4S (launched 2011), and it remained the bestselling phone through now.
A few thoughts to add to the article:
- On why webcams aren't seeing innovation, I'd disagree that the market is too small. There's enough gross margin to produce a $B company just by selling webcams [0], especially if you can actually get customers excited about the product.
- A big reason there hasn't been innovation is that the space doesn't attract entrepreneurs (because hardware is viewed as hard) or investors (because hardware is viewed as hard).
- Size isn't everything. As the iPhone shows, you can get very good image quality from a tiny sensor and lens if you have the right tech supporting it. (At Lumina, most of our eng effort is on the software layer)
I would've loved to see Lumina in his comparison. We launched a few months ago and are seeing many reviewers prefer us over the Brio (Logitech's flagship) [1]. Personally, I'd guess we're at 60% of where we can be in terms of quality, and I think we can achieve a quality level between an iPhone and a DSLR, hopefully closer to the latter.
> Size isn't everything. As the iPhone shows, you can get very good image quality from a tiny sensor and lens if you have the right tech supporting it. (At Lumina, most of our eng effort is on the software layer)
This is my problem with all the webcam startups. So what if you can mask some of the problems of small sensors and lenses using machine learning that adds a whole new set of problems? You could have done that without even making hardware at all. We have plenty of crappy hardware out there already, and if yours is only a minor improvement with the "magic" in software then it mostly amounts to a hardware dongle to enforce your software license. No thanks!
If you're going to bother making hardware, you should make good hardware. That means a big sensor and a big lens. Start there, and sure go crazy with the machine learning afterward, you'll get much better results with less effort when you start with better input! And you'll have no competition because there's literally nobody else out there putting decent lenses on webcams.
I agree. It's a little strange to complain about a lack of progress in webcams the past decade, and then ship similar quality hardware to decade old webcams.
Why does everyone focus on a small form factor for webcams? Look at the size of microphones being sold today. The Blue Yeti is one of the most popular microphones and it's almost 30cm (12") in height. You might say, if you don't mind a large form factor, then go buy a DSLR or mirrorless camera. The problem there is they're not designed to be webcams. I watched one of the Lumina videos, and they were complaining the $2,000 Sony camera took forever to set up and configure for their demo. That's a problem. Someone solve it. Give me a large high quality camera that is designed to be a webcam first. I have lots of room on my desk; it doesn't need to fit in my pocket.
> I watched one of the Lumina videos, and they were complaining the $2,000 Sony camera took forever to set up and configure for their demo
Is that even a real problem? I started using my Fuji mirrorless camera as a webcam during the pandemic and it was almost trivial to set up. All I needed to do was download some official software, plug the camera in and switch it on for it to start working as a webcam.
Same here: I used my Sony A7III with USB and their software as an excellent webcam, and since I bought the new Sony A7IV, my clients always ask in video calls what webcam that is. At my new employer I had several coworkers asking me how the heck I did the insane background blur. (They didn't know it was a mirrorless.)
If you have an old DSLR or a modern camera, you can use a $20 Chinese HDMI capture stick and get the best webcam on the planet.
Old DSLR - I think there’s an issue with latency that you’ll find on many older DSLRs - converting out to HDMI in real-time - that makes them impractical for use as webcams. Make sure you record yourself in a test conference first and verify.
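If you want a rough number before a real call, here's a sketch of the classic mirror-clock test (Python + OpenCV; device index 0 is an assumption). Point the camera at the window this draws: the clock visible inside the captured frame lags the clock being drawn, and the difference approximates your glass-to-glass latency.

    import time
    import cv2  # pip install opencv-python

    cap = cv2.VideoCapture(0)  # assumption: the capture card shows up as device 0

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        now_ms = int(time.monotonic() * 1000) % 100000
        # Draw the current clock onto the frame; the camera, pointed at this
        # window, will capture an older value of the same clock.
        cv2.putText(frame, f"{now_ms:05d}", (20, 60),
                    cv2.FONT_HERSHEY_SIMPLEX, 1.5, (0, 255, 0), 3)
        cv2.imshow("latency-test", frame)
        if cv2.waitKey(1) == 27:  # Esc quits
            break

    cap.release()
    cv2.destroyAllWindows()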
I dunno what you count as old but my 8 year old Sony A6000 has basically no latency at all. You just have to look up some reviews beforehand and make sure that the camera you're looking at supports a clean HDMI output, combined with a decent capture card you can get rather great image quality for a reasonable price.
My Sony A6000 with a 35mm f3.5 lens was 350 euros second hand. Elgato capture card was 99 euros, a mini-hdmi to hdmi cable 8 euros and a dummy battery wall charger was 25 euros. All in all I get a 60 fps 1080p crispy image with great bokeh for under 500 euros.
There's always the Opal C1 that comes at around 300 dollars but imo the image is pretty terrible in comparison.
DSLRs can be pretty old - I have a Canon EOS on the shelf that outputs a clean 1080p picture and is over 15 years old. Looks great, but the latency in the video signal sucks. Probably the same Elgato capture card you have.
You'd think that any DSLR from, let's say, the last ten years would be just fine, but it all just comes down to the internals of the cam in question. I'm just saying it's wise to check the experience on the other end before putting an old DSLR into production. Out of sync audio is more distracting than lower quality video.
Speaking of cheap cameras - don't count out older camcorders either. An older (but quality) camcorder will get you drastically better image quality and control capabilities than a webcam too, same process as hooking up an older DSLR (capture card, etc). Same challenges to be aware of with latency, and maybe more so. I've seen a camcorder produced in the last two years that had too much latency in the HDMI signal to be worth the trouble - it was lower end, but it just illustrates the point that component quality varies and has an impact on your use case.
Except it cannot handle exposure changes at all, whether from a cloud or just moving around in the frame. So you AE lock and babysit. It’s just easier to link up your phone.
Smartphone cameras are small, but their sensors are far better than what you get in webcams. They rely on software to produce images of the quality we're used to seeing, especially with less-than-ideal lighting, but the lenses and sensor are still far better than the Logitech C920's.
A webcam doesn't need the latest and greatest video hardware. Give it a camera module from a midrange 2017 smartphone and use the latest image processing tricks, and you'll blow away almost any webcam on the market today.
> Give it a camera module from a midrange 2017 smartphone and use the latest image processing tricks, and you'll blow away almost any webcam on the market today.
Sorry, that's exactly the strategy of these webcam startups, and Apple with the Studio Display, and the results empirically suck. Even if you could get 2017 smartphone quality (which is still apparently an unsolved problem), why should you settle for that in a webcam that doesn't need to fit in your pocket or run on a tiny battery or cost a tiny fraction of the BOM of a much more complex device? We should be doing way better than that. And way better is definitely possible.
As stated somewhere else, in the end the video will go through very lossy compression and often be scaled down when used at work. I mean, I'm now on my 4th webcam since the pandemic started because the other cams were: too dark, constantly zooming in and out for no reason (MS LifeCam Cinema), poor image. I've now stuck with the Logitech C920 because I can manually set brightness/contrast/activate autofocus.
I think that understates how bad the current state of webcams really is, especially when you move around a lot while talking or there's an unusual lighting situation. IMHO these core problems need to be solved first (and then of course it'd be great to build upon that).
Bigger sensors and bigger lenses are exactly what would solve those core problems. Even after video call compression (which is not always extreme) it is trivial to tell whether a video was taken with a big sensor/lens or a small one. Compression is no reason to use a crappy camera. Garbage in, garbage out.
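A crude way to sanity-check that yourself: downscale and heavily compress a still from each camera and compare what survives. A sketch (JPEG stands in for the real video codec; the filename and quality value are made up):

    import cv2  # pip install opencv-python

    frame = cv2.imread("camera_a.png")  # a still grabbed from camera A
    small = cv2.resize(frame, (640, 360), interpolation=cv2.INTER_AREA)
    ok, jpg = cv2.imencode(".jpg", small, [cv2.IMWRITE_JPEG_QUALITY, 35])
    crushed = cv2.imdecode(jpg, cv2.IMREAD_COLOR)
    cv2.imwrite("camera_a_as_seen_on_call.png", crushed)

Run it on a frame from a big-sensor camera and one from a webcam; the difference tends to survive the crushing.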
> So far no one has, there must be a reason no one can.
I am reminded of the story of two economists walking down the street. One spots a $20 bill on the sidewalk and points it out to the other, who says "Don't be ridiculous, if there really were a $20 bill on the ground someone would have picked it up by now."
Welcome to what camera makers realized a long time ago - film doesn't matter, the processing does.
It's why the DSLR was (is?) a thing - if you ever do research on one, you realize they factor in quantum mechanical effects to make your picture better... (in addition to the 20 or so simultaneous images they take in order to parallel-process the image).
I haven't kept up as much, but the flat-sensor craze was all about how we didn't need DSLRs anymore and could just get away with large pixel counts and better processing - and yet the processing still matters... far more than the pixel count. I think that was mainly because 100+ mirror systems are a bit expensive compared to a single flat sensor.
But when you think about them as 100+ parallel processing sensors performing low-power auto-computation _for free_, well.
That's another thing entirely.
I do wonder why some DSLRs suck at video - I suspect it has to do with whether the manufacturer cheaped out on the compression circuitry or not - some cameras could shoot HD video, a lot could not, leaving that to the pricier models.
At that price point you're competing with a GoPro, which is likely as good or better at a lower price. All (or most?) current GoPros can be used as a webcam.
Actually I tried to do exactly this. Except the GoPro webcam software straight up doesn't work and they don't plan on fixing it.
It's depressing, really.
That would indeed put a dent in that plan :\ I suppose there are always DSLRs, as those are also sometimes in a similar price bracket (many recent ones can act as a webcam as well).
I gather support / fixes / etc have been rolling out rather slowly, have you tried it in the past few months? Or is it currently that unusable? I certainly wouldn't expect the software to be difficult to build, there are quite a few webcam-emulating projects out there... but then again this is a company's software rather than an interested hobbyist.
GoPro is too clunky to carry around for use with a laptop.
Meanwhile, sensor and lens from a smartphone camera can fit in a small enough form factor that it makes sense to carry it around alongside your laptop.
Almost all laptops have a camera already, crappy as they may be, they still work. I don’t think laptop users are a big market for webcams? Isn’t it more for desktops where you have the luxury of space and bulk?
Growing trend in a niche market I would suggest. I know people who have 2 laptops - one for gaming, one for work (with webcam etc). Because the gaming laptop is too bulky to carry around, it's effectively a desktop.
If you are a gamer, you don't use a laptop. Gaming laptops are effectively great portable desktops: they have little battery time for gaming or heavy compilation, so they have to be plugged in - and they're still not great as a desktop gaming station.
I personally use a 'gaming' laptop as the default work computer (it's over 3.7kg w/o the charger which is another 600g)
A low end mirrorless camera is a fantastic webcam -- the only problem is it's clunky to set up that way especially on the software end. If you made one that's plug and play I would buy it.
Depends on who you are. "Nobody" would spend $1k+ on a desk mic, unless you are upping e.g. your podcasting game.
I suspect a lot of vloggers, or at least wannabes, can and would easily shell out a measly $500 for a webcam.
I agree, for a lot of people who just want to webcam mom that's going to be overkill, but if you have money, or do it professionally (I don't mean a remote work job) to the point where you are putting on makeup and investing time, often to the tune of thousands or tens of thousands of dollars, it's a no-brainer.
The Studio Display does have a 27" 5K P3-color monitor attached to it, which drives up the price a bit...
Seriously, though, in the context of webcams, the Studio Display is kind of a counterpoint to "just rip a camera module out of a few-year-old smartphone and it'll be terrific". That's pretty much what Apple did here, and it's rather notoriously, ah, not great. (I have one, and I think it's better than some people give it credit for, but I know I'm comparing it to Apple's laptop webcams, which have historically been pretty terrible.)
Sadly, Apple's laptop webcams are still way better than the ones in competitors. I used an HP Spectre from 2020 or so and it had literally the worst digital camera I have ever seen, including the 90s. It's pinhole sized, and it made a brightly lit room look like a closet at midnight while being grainier than off-brand film. They might as well not have bothered.
There's no such limitation with a dedicated webcam, the kind you mount on top of your monitor. Yet these also are noticeably worse than what you'd find in a 5-year-old phone.
Resolution is just one factor. What processing you do to an image can easily make a lower resolution image look better than a high resolution image with horrible colors, bad contrast, poor tone curves etc. I'd describe the output on Macs as "unoffensive". It isn't great, but it also isn't bad.
Most people who dabble in photography and shoot raw will have some idea of this. The (untreated) images that come out of my professional DSLR don't look as pleasant as those that come out of my iPhone. Even if they have 4 times the pixels, and each pixel has much more dynamic range. How the raw sensor input is interpreted, influenced by intent, and then rendered, can be the difference between something you would hang on your wall and something you'd delete.
I have half a dozen Macs with cameras, including the 16" notched M1 MacBook Pro. None of them produce a great image, but they won't stand out as bad in a video conference. My $1000 camera does stand out because of its poor image processing, especially in low light conditions. It is a bit sharper, but the colors look distinctly off.
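To get a feel for how much the rendering step alone changes things, here's a minimal sketch of the kind of processing meant above: white-balance gains plus a simple tone curve on one frame (OpenCV/NumPy; the filename, gains, and curve constants are illustrative, not anyone's actual pipeline):

    import cv2
    import numpy as np

    raw = cv2.imread("frame.png").astype(np.float32) / 255.0

    # Per-channel white-balance gains (BGR order; values are made up)
    gains = np.array([0.95, 1.00, 1.15], dtype=np.float32)
    balanced = np.clip(raw * gains, 0.0, 1.0)

    # A crude tone curve: gamma lift for the shadows, then a gentle
    # highlight roll-off. Real pipelines use scene-adaptive curves.
    toned = np.power(balanced, 1.0 / 1.8)
    toned = np.clip(toned / (toned + 0.1) * 1.1, 0.0, 1.0)

    cv2.imwrite("frame_processed.png", (toned * 255).astype(np.uint8))

The same input renders to two very different-looking pictures depending on those few lines; that's the gap between "unoffensive" and "distinctly off".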
For what it’s worth, I agreed with the cynical view of GP until I read this comment. A very convincing defense of the prioritization of the software layer.
Completely agree, this is the old adage: "Garbage in, garbage out". Improving the data the sensor collects is surely much easier than using ML models to correct after the fact. The people willing to pay a premium for nice big lenses/sensors are the market to go after, everybody else won't care since they already feel their current set up is good enough.
I expect mobile phone cameras to kill webcams. All the innovation happens there, and when people already have something getting closer to DSLR quality each year, why invest in a specialised webcam?
Apple just needs to bake webcam sharing/remote camera control into iOS and come up with a nice attachment for the phone that works with MacBooks and displays so you don’t have to look at the camera at a weird angle.
Maybe some magsafe sticker behind the display so you can snap your phone to the back of the display easily, with the main rear cameras pointing at you above the display
> Apple just needs to bake webcam sharing/remote camera control into iOS and come up with a nice attachment for the phone that works with MacBooks and displays so you don’t have to look at the camera at a weird angle.
> Maybe some magsafe sticker behind the display so you can snap your phone to the back of the display easily, with the main rear cameras pointing at you above the display
You do realise they just did exactly this, right? :)
But look at it and see why it doesn't really work.. It's an awful contraption, and your phone is tied up for the duration. And I doubt it will work well with every case either.
Apple already did this as discussed, but I’m going to make a different point:
This whole idea is against what Apple wants to be and so I think they will kill it (“evolve past it”) as soon as they can. Using an iPhone as a webcam is below Apple’s threshold of sleekness.
Apple would be the kind of person who doesn’t have anything in their pockets because that would “ruin the lines”, and when they need to buy something at a store instead of carrying a credit card they just call their lawyers and tell them to buy the whole thing so they are free to take whatever they want and write it off inventory.
When they interact with someone who is not as sleek, they look down and imply they are poor. “Why would you carry around that big thing (card) when you could simply buy the store? I don’t know if we should associate with each other...”
I frequently use my phone while in a meeting. I can't see how using your phone as a webcam would be productive. Using a phone, like an old iPhone model you have upgraded from, would be neat, though. But there are a bunch of apps that let you do this today, I believe.
This appears to support just photos and document scanning. And also seems to use BT and WiFi.
I have a friend who lives in a high rise with incredibly crowded wifi spectrum. Anything wireless that involves latency sucks in their apartment. They tried to use an old iPhone for VC, but the jitter added by their crappy wifi made that a non-starter. By contrast, this product uses a USB connection, and I'm guessing would perform far better for them (and really anybody with crappy wifi, which is most people).
I agree that it would be awesome to use an iPhone in this context. But given the problems my friend has, and given the problems in general with mac wifi and latency, any use as a webcam would be far better if it were wired. Hopefully that will leave a niche for this company. That, and using Android phones, and hooking any phone to a Windows box.
Agree, the flat and tiny webcams aren’t really necessary at home or in the home office and I don’t know anyone who carries even the smaller webcams around.
As long as it doesn’t look too bad, a larger lens and sensor would be perfectly fine (think of the original iSight camera, something like that would totally be acceptable) and I’d rather not have my computer be slowed down even further by the webcam doing some magic in the background.
Agreed. Optics aren’t that hard. If you put a good sensor in something with a form factor similar to the old iSight camera, seems to me like it would be a winner.
People seem to have embraced those stupid circle lights everywhere, why is a decent camera a bridge too far?
I bought a Lumina in December. It still doesn't do automatic tuning of the video; everything requires manual adjustment. I can't just have it... work. It even ships with a color card, which I assume is for automatically tuning things, but it fails miserably. I have tried tuning it a dozen times; maybe I get it right, but only until the light changes (I am by windows). Finally, after about 5-6 months of giving leeway, I gave up, uninstalled everything, put it in a box in my attic, and went with a 5 year old webcam that doesn't look amazing but doesn't look bad.
I have noticed you keep shipping a bunch of random features. I could be wrong, but my recommendation is to try and get it to automatically work right for 99% of cases without manual user input.
Agreed. I backed it and was really hoping to like it. I've since switched back to my Brio. I felt like it took work just to get the quality I was used to with my Brio. It felt like it added more steps and the end result was only on par.
Does Lumina work on Linux? I can't really find anything about it (there is also a Lumina Desktop Environment for Linux it seems), and while it says "Lumina is compatible with Windows and Macintosh machine" this can mean "we don't directly support Linux, but it should work" (like Logitech), or it can mean "we have specialized drivers/software that only work on Windows and macOS".
If your software works only on Windows, you've written non-portable software. Likely, you have all your core logic mixed up with Windowsy UI and now you're stuck there.
If it works on Windows and Mac, you've written portable software but just can't be bothered to support Linux. At a guess the problem is that you want to ship binaries only and don't want to deal with doing all the packaging and testing for Linux.
For a consumer hardware company, one wonders what the benefit is to binaries-only, unless you're hoping to charge a subscription for the software to run hardware you already bought. Or the software is such a mess internally that you'd feel ashamed if it were public (which I can understand, having seen commercial code). Or you have some security feature that you don't want to reveal (e.g. disallowing use of your software with third-party hardware). However, if someone wants to reverse your software methods, they'll find a way.
Perhaps naively, my thought would be that if you want to sell hardware and not software, an excellent consumer-grade and consumer-priced camera platform with an open hardware interface and a decent reference implementation would actually find a lot of uses on embedded (i.e. probably Linux) platforms.
When you write drivers there's no such thing as portable code. You need to target different APIs, different environment assumptions, etc. Sure, there's some common core most likely, but I think this rant is misplaced when talking about a hardware company.
The software postprocessing is a different question and could be more common... But you're still specialising it for DX / v4l2+gui / whatever macos uses. That said... I'd kill for an independent library for video processing that specialises in camera outputs - beyond different profiles, there's no technical reason that couldn't be used independently of hardware.
Maybe in 2002, but it's a USB webcam in 2022, if you're writing your own drivers from scratch on any platform rather than using the standards-compliant ones, I'd first wonder if you haven't got a case of NIH syndrome rather than an actual case of being constrained by the standards.
But maybe they genuinely cannot use standard protocols for some reason, then sure, writing a Linux driver might not be worth it.
I'm half assuming they do something special, half responding to the comment which describes them as a hardware company. But yes, ideally a webcam should have the basic functionality available just by plugging it in, no special software needed.
The postprocessing should be done in the cam though, IMO.. And it should just give the result back ideally in a standard format.
But I would prefer better optics and sensors over AI stuff like many others have mentioned. When you have good video you have much less need to fiddle with it.
I worked for a company that did pre-infiniband HPC interconnect drivers (Myricom) in the 2000s. We supported N-different *nixes (Linux, Solaris, FreeBSD, Tru64, Aix, and probably more I'm forgetting), Mac OS X and Windows in the same codebase. At least 50% of the driver and 95% of the userspace code was OS independent. In order to do this, our drivers were not "typical" drivers. Eg, we used ioctls everywhere (even on OSX) for driver/library communication.
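That pattern still applies to cameras today: on Linux, a webcam is just a file you throw ioctls at. A sketch of querying a device's V4L2 capabilities from Python (constant and struct layout per linux/videodev2.h; /dev/video0 is an assumption):

    import fcntl

    VIDIOC_QUERYCAP = 0x80685600  # _IOR('V', 0, struct v4l2_capability)

    with open("/dev/video0", "rb") as dev:
        caps = bytearray(104)  # sizeof(struct v4l2_capability)
        fcntl.ioctl(dev, VIDIOC_QUERYCAP, caps)
        driver = bytes(caps[0:16]).split(b"\0")[0].decode()
        card = bytes(caps[16:48]).split(b"\0")[0].decode()
        print(f"driver={driver} card={card}")  # e.g. uvcvideo + model name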
I don't really get this. Linux has had the same problem with audio interfaces, and you know what the solution was? Class-compliant USB peripherals. That's probably what people are asking for here: nobody wants to install additional software to make their webcam work, so why can't the processing be done on-device (especially if they're charging $200!).
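That's exactly what class compliance buys you: the kernel's generic uvcvideo driver exposes the camera, and any V4L2 client can use it with zero vendor software. A sketch (device index 0 assumed):

    import cv2  # pip install opencv-python

    cap = cv2.VideoCapture(0, cv2.CAP_V4L2)  # stock kernel driver, no vendor app
    cap.set(cv2.CAP_PROP_FRAME_WIDTH, 1920)
    cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 1080)

    ok, frame = cap.read()
    if ok:
        cv2.imwrite("grab.png", frame)  # one 1080p frame, nothing installed
    cap.release()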
If they are selling undifferentiated hardware supported by smart/magic software, they need to block people from running their fancy software on somebody else's cheap hardware.
If they want to get paid for the magic, it will be more expensive to buy hardware from them than it is to buy equivalent hardware elsewhere. How do they get you to buy their product?
My thinking was that it might use some platform-specific APIs for that AI stuff it mentions, or maybe it requires some non-standard extensions because of it. Whether that's a good design or not is kinda beside the point here.
I might be interested in this camera, and asking support will likely only generate "we only support Windows and macOS" without any nuance, so I figured I'd ask here. Better ask before spending $200.
It may be confusing because it's in reply to "Does Lumina work on Linux?" but doesn't answer that question. The tl;dr of the answer would be: "there are reasons it might not work, but I don't know".
From the Obsbot site: "The OBSBOT Meet series hardware is able to work with both macOS and Linux, but we will not make the exclusive software for the Linux operating system."
webcams on linux with good driver support are not just for use by humans but in a myriad of industrial/embedded applications, for single image acquisition, etc.
don't just think about people using linux desktop PCs in home offices, but all forms of other use of camera systems.
I can understand the reason for believing there's no overlap, as Linux users tend to shy away from tracking-like features in general (which is the job of a camera!). But a few privacy features like a physical lens cap, and a hardware light that turns on if the camera is powered (not controlled by software) would really incentivize that market, I think.
Linux users still like quality hardware, and are willing to pay for it.
> I can understand the reason for believing there's no overlap, as Linux users tend to shy away from tracking-like features in general (which is the job of a camera!).
I don't agree. This is not the same 'tracking' as the one that Linux users generally abhor.
I'm a privacy nut and I track a whole ton of stuff around my house by the way and for my health etc. The difference is just that I track this stuff on my systems. Not on Samsung's or Xiaomi's or whatever. I use their brands actually but with privacy-conscious software. For example the amazing FOSS "GadgetBridge" app works great with Xiaomi's smartwatches and fitness bands, giving them zero internet access and thus no way to track me. Yet it still collects the data for me. I know I should support a FOSS hardware product too but the only one around, the PineTime, is too immature to be useful IMO (I do own one). For my house stuff "Home Assistant" is amazing and it supports many health devices like scales also.
So, tracking is not bad. Companies tracking me is bad.
> But a few privacy features like a physical lens cap, and a hardware light that turns on if the camera is powered (not controlled by software) would really incentivize that market, I think.
Yes a lens cap is a must. I bought an aftermarket one for my Logitech C920. It seems to be something that's more on the radar now as many laptops have them again (like Lenovo).
> Linux users still like quality hardware, and are willing to pay for it.
In an attempt to disabuse this notion that "nobody uses linux"...
Spoken like someone who only eats from fast food restaurants.
Fast food, for the most part, is horrid, disgusting, greasy, bad for your general health. Sometimes, I'm out and about, and I eat from a fast food restaurant.
I also cook. Lest you think I'm in some minority, the cooking business is a billion dollar industry, with multiple "trendy" startups (hellofresh, blue apron, etc.) selling pricey services. I've even used some of them.
Yes, some people only eat fast food. Some people know no better. Yes, eating non-fast food can be inconvenient, but it's incredibly good for you... and just because we do good things doesn't mean we are all starving artists or paupers.
But no, we are probably not sitting 400lbs overweight surrounded by burger king wrappers filled with xenophobia at anything that's not "like us".
I purchased and returned a Lumina in January. I was hoping it was "good enough" to replace my current setup (Fujifilm X100f with a battery blank and Elgato camlink) and free up my family camera to do family camera stuff again.
It was better than the built-in camera on my gaming laptop, but nowhere close to what I was expecting (even more so because I was on Windows). More "okay-to-good webcam" than just "good" quality without pairing that word with "webcam." I'm on the waitlist for the Opal C1 and will give that a shot, but honestly I might spring for the next-gen Fuji and turn mine into a permanent webcam (too bad my first generation X100 isn't supported by their webcam software update).
1. If you're saying that it's mostly software, then you shouldn't have created hardware. You should release and sell software to run in tandem with existing cheap webcams. Not a low-quality webcam with some magic software.
2. Saying the iPhone is proof that you can make good, small cameras is bunk. The iPhone has an enormous amount of horsepower at its disposal. And an enormous team of software engineers (in addition to hardware engineers). A small startup with a couple of engineers isn't going to replicate Apple's camera team. Plus, are you aware just how incredibly powerful Apple's SoC is? To replicate it, in addition to the world-class engineering team, you'd either need to offload all that to the host CPU (which won't have dedicated hardware for it), or build it in, which you can't for the price-point you'd need.
I recently bought the elgato FaceCam, and honestly it's pretty good for $150. Much better than Logitech. It has a pretty large sensor and good software (and no autofocus!), but there's nothing magic about it.
The iPhone has a high-res display, hi-capacity battery, associated regulator hardware, multiple SoCs, cost of supporting a software/firmware engineering team for the OS/drivers, app developers, UX/Design experts, company markup....
Not cheap. I don't buy this argument - they have a nice image processor, sure, but there's nothing special about their image processing. Sure, it's not going to be cheap, and I think you are vastly overestimating the quality... for a price point of $1K per phone, even at scale, you are talking maybe a fifth of operating costs ($200) going towards supporting that.
Remember Apple makes its money from app commissions, more than the hardware, and AFAIK they don't operate selling hardware at a loss.
You will need a SoC team, yes, you are looking at a multimillion dollar company, doable with VC I'd imagine. Or a team of brilliant-yet-bootstrapping individuals, willing to work for less.
What sucks most is that Logitech, which used to be a proud name (back when "Feels good / Feels better") has gone totally to shit. Even their mice are designed to fail in a year. (Spill on the desk, the mouse that only got damp on the bottom is dead, dead, dead. That can't be accidental.)
A full article on what went wrong at Logitech corporate could be enlightening, if not helpful.
Maybe I'm lucky, but I've got half a dozen Logitech wireless mice/keyboard-and-trackpad media keyboards in regular use, plus the MX Ergo trackball, plus a couple of G Pro keyboards, and I haven't been able to kill any of them even through pretty rough use on the mice particularly (they tend to get bounced around in the living room when the dog jumps on the couch, etc.).
"Can't be accidental" reads as conspiracy-huffing. Lemons exist, bad products happen, but this is a weird assertion without evidence. Like, I got bit by the MX518 double-click bug back in the day, but everything I've used from them since--and I won't buy Razer because their stuff's awful to look at, which has historically cut down the options a lot--has been unremarkably fine. Except for the MX Ergo. That thing is remarkably excellent.
You talk like you have never heard of Planned Obsolescence. But it has been in business school textbooks since probably before you were born.
Whirlpool refrigerators have a "light control" circuit board to make the dome light come on slowly when the door opens. It has two resistors carefully underspecified to fail after warranty end, requiring a $150 replacement board. (Lots of pics online.) Good for them, bad for us. A $2 transformer and diode would last forever.
Whirlpool washing machines have a Motor Control Unit board carefully designed so parts on it literally explode shortly after warranty end, requiring a $300 replacement. (Lots of pics with exploded parts online.) There is of course no need for the parts to explode, and commercial washers from the same manufacturer do not share this failure mode.
There is literally no legitimate reason why moisture on the bottom surface of an optical mouse should possibly have any effect on its operation or longevity.
> There is literally no legitimate reason why moisture on the bottom surface of an optical mouse should possibly have any effect on its operation or longevity.
Sure there is. You got a lemon with an unexpected gap in the base panel, and electronics don't like water much. That happens; that doesn't mean it's a defect by design, and the shouting about it needs a lot more substantiation than "well, have you heard of this thing in another industry entirely?". Tolerances get exceeded, stuff breaks, warranties are sometimes necessary. This sniffing about capital-P Planned capital-O Obsolescence when this stuff works in the main quite well for a lot of people (and, as mentioned, my own pretty wide array of hardware from them) has a long row to hoe to substantiate it.
You buy a dishwasher once a decade or so. You buy mice rather more frequently than that. From where I sit, there's more incentive to hand you a decent product so you come back and buy another from the same manufacturer--and, as it happens, that's much of why most of the input devices kicking around my house and my studio are Logitech, because they're generally a decent floor of quality.
Lifelong user of Logitech trackball mice, now using the MX Ergo. I also used a Logitech keyboard for many years, and am still using a Logitech webcam. I've had an equally pleasant experience of reliable hardware with good Linux support.
No need for conspiracy - their mouse buttons are run out of spec which guarantees the infamous double-click. This is an issue on pretty much all their (esp. wireless) mice.
Great mouse but always am missing the scroll wheels from the MX Master. As soon as they release a new MX Ergo with upgraded wheels, I’ll be upgrading. Just smoosh the two together and it will be excellent.
Their mice still have the same terrible button that they had 17 years ago when I bought an original MX 518, as far as I can tell. I've fixed the button so many times on every mouse from them since then. I even made the mistake of buying the re-released MX 518 a year ago, thinking that since they put this new fancy sensor in, maybe they'd finally fixed the button. It took 4 months for me to get a double click problem from it again that required me to open it up and fix that damn microswitch. I don't know what mouse I'll get next but I've completely given up on Logitech.
One of the reasons their buttons suck: they effectively run the microswitches out of spec. The D2F-01F requires a minimum of 5V with micro loads [0]. There would be a tiny spark during switching, and over time that would prevent a perfect contact; the higher voltage helps penetrate through. Omron has factories in different countries, e.g. Japan and China - with the China switches "known to be not as good as the original ones". Of course the mice feature the China versions too.
In other words - your best bet is still replacing the switches to ensure a longer life... I wonder if Logitech would ever consider stepping up the voltage via a capacitor charge pump to prevent double clicks.
True, it was a great brand when they were growing. Then they got big and the bean counters started shaving away at quality. See all the problems with the G815/915 keycaps (I looked into it because I was thinking of getting one)
I use Microsoft's $5 "Basic Optical Mouse" mice now, which generally last me for years and are surprisingly comfy (though typical $5 creaky plastic). I have RSI so I'm fussy with mice, but this one is pretty good for me. So even with beancounting it's possible to make a quality and lasting product. I've only had to replace two of them in the last 10 years or so (and I have 6 of them in use). Both were even still working, but the plastic was just getting too creaky and glossy from all the use.
The software is an abomination too - they switched from a native program to an electron app that I'd describe as "occasionally functional". And you do need to use the app to do simple things like checking battery of wireless headphones.
The G502 is a reasonable replacement, though the shape's a little different. If you're good with wireless, the G602 is my normal go-to gaming mouse.
> that would be like if Apple stopped releasing new phones after the iPhone 4S (launched 2011), and it remained the bestselling phone through now.
If I can be a bit cheeky - this sounds like exactly the kind of frankly ridiculous comparison a founder would make about their product alright.
The reality is - nobody really cares that much. Whatever image you record is going to be gigga-smashed by whatever application you squeeze it through. I switch between the rubbish built-in on my laptop over wifi, and a GoPro tuned to the highest res and framerate the cable will take, with colour and exposure tuning etc.
Not one person has ever mentioned webcam quality in either case. As long as you can vaguely see _most_ of someone's face in _somewhat_ balanced light - that's good enough, for 99% of people in 99% of cases. Even in job interviews, where image is everything, it's irrelevant.
The problem that needs solving is audio. That actually matters.
I tried ping.gg recently which boasts high quality video and audio feeds (for a high price) and even then - meh. Video was entirely unimportant.
The only people who really care about live video feed quality are content creators with high powered static systems, like streamers. Even then, they can just hook up a DSLR and smash it out of the park with little to no effort.
I don't see this as a real problem anyone bar a select few care about - and that select few has already solved the problem anyway.
> and think we can achieve a quality level between an iPhone and a DSLR
Good luck, but I think you're filling a spectrum nobody is concerned about.
I don't agree, my boss uses his iPhone a lot and I have commented on the quality of his video and I've seen tons of others do the same. This is with most people using their laptop's cams though which are admittedly terrible. And with MS Teams which is not very good at coping with poor video sources (even the free Jitsi is tons better).
Audio is important too yes but we all get noisecancelling headsets in work that make this a non-issue. I live in a really noisy street and most of the year I have the windows wide open (Spain :) ) but curiously they don't even hear it unless one of those d*cks with a muffler-less motorbike races past.
By the way how does the gopro work out as a webcam? Never thought of that but it might be an option.
> By the way how does the gopro work out as a webcam? Never thought of that but it might be an option.
It works very well. The newest versions can be used directly and natively in webcam mode, possibly even wirelessly. I have a Hero 7 Black. I use a cheap adapter and it works flawlessly. One thing to bear in mind is that processing power for real time is limited - you won't be able to stream in full 4K with stabilisation and so on. But you'll get a high quality feed from a dedicated camera, on which you can tune exposure and so on. Also, the fisheye + different lens output selection is actually quite nice.
I commented elsewhere addressing the rest of your comment - selection bias here is extremely strong for "highly specialised tech nerds who care", when my argument is "the world at large doesn't care".
> Audio is important too yes but we all get noisecancelling headsets in work that make this a non-issue.
It's not the audio out that matters. It's the audio in. Everyone uses shitty mics.
Because his opinion is supported by the status quo (crappy cameras everywhere).
My anecdata says the same. In most video confs your video is put up on maybe 1/16th of the screen, to the corner, and people generally share something like a presentation. I guess if you're in sales, it matters more since your face will be fullscreen.
But even then... most people don't work in shiny offices anymore, are you sure you want to display your dirty socks on the shelf in the back in all their 8k glory? :-D
Camera aside - people will do wonders for their video presence by putting some effort into their background and lighting. If you’re not going to use a blur or picture background, clean up the area, consider how it functions as a backdrop (including wash lighting, etc), and what impression it’s making. Lighting is king - be deliberate about it.
Completely agree with you on audio. I’m not entirely certain it’s solvable without cultural norms changing. Even the absolute best noise reduction systems cannot work in every environment and echo cancellation always has enough delay to cause people to “step” on each other. It’s just physics and propagation of sound waves.
At a minimum, the use of in/on-ear audio needs to become the norm. Echo cancellation makes most "speakerphone" setups unbearable in my opinion.
I suppose my comment does seem like it’s talking about background noise. My major issue is echo cancellation. Background noise is a factor that really messes with echo cancellation but, if everyone is using on/in ear audio then it becomes less of a factor.
The amount of comments I got on my video quality once I set up my mirrorless with proper lights is insane. Nobody cares until they see someone with production quality video and they notice right away. It makes a difference. It's like if nobody cares if you wear a tee and a hoodie, but the second you break the norm and dress a bit nicer people do notice.
What even is an "AI powered" camera? You mean a device that will become useless when your company goes out of business in a few years, or shifts focus, or that only ever works on specific versions of Windows/macOS due to some required software?
No, what people want is a decent modern UVC webcam that adheres to USB standards.
I have the Lumina! I think the hardware has been pretty good from the very start when I got it about a year ago. The software originally was... not very good, but you guys have improved it by leaps and bounds over time. I still have to manually adjust the exposure (which is fine), and fiddle with settings to get the colors right, but it works better than any webcam I have so far. Keep up the good work.
I think mics and webcams suffer a simple problem: the person who enjoys the benefit of higher quality is not the operator. Indeed, the operator may never experience how they look or sound.
That might be true, but your reputation is at stake as well. For the same reason that you might wear a suit to an important meeting, you might also take steps to ensure that your video and audio quality are presentable when you're having an important meeting virtually. (Of course, once you've paid the cost for that, you can just do a good job for all meetings. Much easier than a trip to the dry cleaners.)
Not even just reputation. I have seen people with poor (often too soft) audio get 'snowed under' while trying to raise a point. It's much easier to ignore someone who is less 'in your face'. In that sense I think it's really important for effectiveness in meetings.
Appreciate the effort. I think one thing that hinders innovation is that, as soon as anyone makes a decent camera that sells well, a company like Logitech will just try a little harder and bring a better camera to market at a lower price point. It's not like this is rocket science, Logitech just doesn't have much incentive to try very hard right now.
Your product looks intriguing, does it support Linux at all?
One concern I have with a "smart camera" as a Linux user is that I won't get the simple plug and play experience I've come to expect from Logitech's products, as there's often a required software component/driver to get some or all of the device's functionality.
I know I represent a small part of the market, but I figured it doesn't hurt to ask.
I was thrilled to see Lumina and bought one. I unfortunately ended up having to return it due to a combination of buggy software and image quality that was no better than my laptop's built-in webcam. I really wanted to love it… but in practice it was no better than what ships by default. I also had tons of issues controlling white balance. The software depth of field was a bit better than what's built in to Zoom/Google Meet but still doesn't come close to the illusion of a real lens effect.
Ended up going with a much clunkier and expensive setup with desk stands + a Sony a7c + a 28mm lens + a Yeti mic.
Another thing worth mentioning is that lighting is key — so I got a 3 point key light setup (LumeCube lights) which works well.
It might've been a matter of Lumina still being in beta — I hope it can improve to a point where this comment becomes entirely obsolete.
I had to abandon my review unit of Lumina because the constant refocusing every time I moved my head was far too distracting for work. Could you please give us an option to turn that off?
First, I like what Lumina is trying to do. I pre-ordered, and since receiving the first batch, have been, and will keep, buying them.
However, caveat emptor. Right now, buyers should support Lumina to support innovation in this space, but be prepared to pay a price in quality while they work on it. Several ideas didn't work out, and they're trying to correct. Each software release is better, but I worry the hardware sensor in the shipping models may just not be up to the task.
I have four Brio and four Lumina in our office (several more just came!). LogiTune + Brio 4K is surprisingly (given Lumina marketing including the post above) better than Lumina in most every situation, and drastically better in high contrast lighting where Lumina shows severe washouts and color banding. Some of the article’s shots show washed out yellowing on the author’s face. Make that problem worse, and you can picture what Lumina is struggling with in some lighting. It’s probably good that he didn’t do a comparison yet, though I would note that the most recent software release might have done better than 75% of those tested in his particular setting.
Meanwhile, if you like Lumina’s features but want better video, Logitech Brio 4K + Xsplit Vcam gives the same “virtual cameraman” pan and zoom, bokeh background softness control, virtual green screen, logo/watermarks, and more. Just like Lumina that requires a separate app for “pro” mode (aka software processed), Brio 4K (also most any webcam works) requires Xsplit Vcam to enable those features.
With Lumina the extra app is just enough easier for non-tech users to remember, that’s the direction I’m leaning for our staff.
Going forward, I’d pay 2x the Lumina price for (a) large aperture and high dynamic range sensor running off USB-C, and another 50% for configurable onboard features like the pan and zoom that wouldn’t require a helper app and confusing settings in Zoom, Teams, etc.
It’s certainly possible within the price point. What I likely want is something like this on USB-C:
The quality on this 4K sensor + lens array is remarkable, the optical tele-zoom fantastic, the low light gathering shockingly good. The price point includes a full PoE camera engine and an audio mic.
True, so that wouldn’t fit in a laptop bag. But iPhone optics would fit easily in a Brio housing, and long throw zoom would fit in the Streamcam form factor. Given videoconf is now a fact of work, I’d argue innovators should not be afraid of a larger device to achieve better quality.
Like the author of this article, while I used to use a Sony A9 for web conferences that matter, now I use iPhone + Camo for high value teleconfs. It doesn’t have to fit in my bag, I’m already carrying it. The innovation needed here is for the mount.
All this said, Lumina’s software is iterating fast, their cameraman is easy and works well, and I am buying more of the Lumina over the Brio for our staff.
// @rlei: a few usability notes (1) give us a toggle to turn off the default background image (e.g., show black instead) while virtual camera is panning in, the default scene generates annoyed comments from participants, (2) don’t flip from black to the live image until the initial pan and zoom is done, and (3) consider offering an eye-contact filter and marketing that. Staff who prefer Brio affirm they would switch just for that. Also, (4), don’t make users search email to find a serial number to use the helper software with their Lumina camera.
I don't think it even needs to be this big. These outdoor security cameras have big metal enclosures for heat dissipation, weather shielding, PoE circuitry (transformers) and other stuff that a webcam won't need. There's also the "bigger looks better" effect in this market. After all, most of the applications of these rely on deterrence and it's important that potential thieves notice them. I wouldn't be surprised if that case is half empty.
I'm sure they could make a webcam with these optics and processing in a respectable size/weight. I really wish they would, too.
I'd love to try a Lumina webcam but at the moment it'd cost me $200 shipping and import tax to the UK.
It's a pretty big punt compared to buying something off Amazon that I know I can send back without a quibble if it's no better than my Logitech Streamcam.
Nice! I'd like to ask: as there's a growing trend for (e)motion-controlled 3D personas/avatars driven by conventional webcams, are you considering that as one of your potential directions of development? As a potential buyer, I'm in need of an inexpensive webcam with invisible IR backlight that would reliably capture facial expressions at moderate resolutions (up to 1280*960) even in a dark room without being distracting or harmful. A fast internal basic encoder (maybe primitive, but fast) would be nice too.
If you can do enough in pure software why are you developing hardware at all? Why not make your product a software offering working with any webcam? Or OEM an existing webcam with your software?
> A big reason there hasn't been innovation is that the space doesn't attract entrepreneurs (because hardware is viewed as hard) or investors (because hardware is viewed as hard).
It is so easy to throw yourself into software; with hardware I wouldn't know where to start as an amateur - it involves electricity and circuitry and probably months of learning, whereas in software an amateur can scratch together a proof of concept in a few weeks.
Aren't people who start businesses in a particular space professionals in that space already? It's hard to imagine someone who is new to software jumping in and trying to start a software company.
Not always; sometimes it's just an interest, or they don't necessarily have the technical means. For example, a restaurant owner is not necessarily a technical expert, but could have an idea for a generic product that benefits every restaurant out there, prototype something, get funded, and hire real devs to rework his prototype into a functioning, viable product.
I have met and interviewed with YC founders who are not developers by any means for software heavy projects.
> A big reason there hasn't been innovation is that the space doesn't attract entrepreneurs (because hardware is viewed as hard) or investors (because hardware is viewed as hard).
Why though? It's all off the shelf stuff. If it's in phones you can put it in your product.
> that would be like if Apple stopped releasing new phones after the iPhone 4S
Would that be such a bad thing? Arguably there haven't been many technological leaps in the years since and if it weren't for planned obsolescence the 4S would still be a totally fine phone
Sad to see such an imperfect monitor-mount design on a webcam that tries to be better. For better eye contact, the lens should be as low as possible, as close as possible to where you are looking. This one doesn't seem to even try.
>There's enough gross margin to produce a $B company just by selling webcams [0], especially if you can actually get customers excited about the product.
Is this from HW only or some sort of MRR from SW lock-in?
Imagine you ship a webcam and then find out that one of the capacitors you used is undersized and the camera randomly reboots when another USB device is plugged into a nearby port. You just shipped 5,000 of these and customers are unhappy.
Unlike software you can’t just issue an update. You must manufacture new devices, issue refunds, send replacements. And all the while your competitor sells a very comparable $50 device so you can’t charge a subscription fee and make thousands off each customer over a lifetime. Software margins are huge compared to hardware and risk is often much much lower.
This is wild. What, how is this justifiable, why would anyone pay that amount when you can get an old DSLR and produce amazing quality video with some tinkering, or get a Logi Streamcam or similar for as good quality.
I don't understand the business model / viability of dedicated webcams. Software like Elgato Epoccam or OBS is free or cheap, and the % of people who would benefit from a decent webcam and who don't own a modern smartphone is literally 0.
People who have video calls throughout the day constantly. I don't want to have to mess around setting up my phone on a stand and getting it connected every time I need to do a meeting.
I would sound a slight note of caution there, in that AFAIK not every camera has clean HDMI out and there can be challenges about powering cameras for longer periods of time (if needed).
It is a good option though if you need higher quality. I'm using a Sony ZV-1 and it's worked out pretty well.
Those cost more. It doesn’t make sense to spend more than needed on something you’re only going to use as a webcam. Webcams don’t have to spend on holdability or physical durability or any viewfinding.
I also dual call into many meetings. For these, I use my phone to see how my screen sharing is working. I have to do a lot of screen sharing presentations to present data (tables, figures, etc). With this use-case, it's very helpful to have a second login to be able to see what everyone else is seeing.
Or I’m in a group meeting and I see a text or even a phone call I want to quickly respond to. Apple’s demo at WWDC seems interesting but not sure how much I would actually use it unless it was to partially repurpose an old phone.
As for cameras, I got things set up with my Canon 5Diii, but it was way too much trouble to use given even pre-recorded video is most of the way there with a Logitech C920 - and I'm better than most of the people recording videos.
I've only ever had trouble with using phones as webcams. It seemed like a great idea, but I've probably wasted a good full 40 hours of my life trying to make them work right. And when I get it to work, 6 months go by and they force you to "update" to more broken features and then charge money for basic functionality. Whoops, this software doesn't let you flip the camera the right way. This one doesn't like USB for some reason, so let's try wireless; whoops, it completely saturated the wireless connection with completely uncompressed video and drains my phone faster than it can charge. This one ghosts horribly if you move slightly too fast. Oh, this one seems to work... what the hell, I dared put my hand too close or a light source came in view for a second and it completely borked the exposure levels and fails to readjust, leaving an either mostly black or mostly white picture.
Since I don't use it constantly, if I expect to be using a webcam I need to make sure I have at least a couple hours the day before to make sure it actually works when I turn it on again. And even then I still run into trouble where certain programs see it and certain ones don't. Then I gotta push the video through multiple different programs to get the one program I need to see it right in the correct orientation. What I thought should be a super easy task turns into this huge ordeal and 25 new sketchy data-stealing programs on my phone and computer. I even had a few which I can only assume were mining cryptocurrency, given the amount of processing utilization they used even when the camera wasn't on.
Well, it's at least 1, because I bought a "decent webcam", and while I own a smartphone I don't want to use that for long Zoom meetings.
I bought one of those Instagram influencer light rings so I actually look normal on camera instead of like a corpse, and the camera lives on top of my USB C monitor.
There is a reason that for hundreds of years painters and (later) photographers have preferred working in spaces with North-facing windows[0], and that is because it's a simple way of ensuring neutral, even light. My home office is (by design) in a room with one North-facing window.
I use a Logitech C920. Many people have told me how good my webcam feed is. I don't think it's that the C920 is particularly brilliant, it's that if you know even a little about photography, you can do an awful lot to help ensure a good picture.
[0] assuming they were in the Northern Hemisphere, of course
Lighting and composition can do a lot to compensate for a weak camera, but read the post: this isn’t saying “Zoom video looks bad [potentially because of bad lighting and composition],” it’s saying “even high-end webcams are enormously worse than even old, low-end/front-facing cell phone cameras in the same conditions, and it’s not getting better.”
So yes, all things being equal, better lighting helps. But this post is showing that a better (but similarly sized or even smaller) camera helps enormously, and the webcam market is persistently unwilling/unable to give them to us.
One thing to remember about lighting: take your computer monitor into account :)
I have a bright (600 cd/m², HDR) 32-inch monitor. I did a live stream once (a kind of online training) -- the lighting was good most of the time, but when I tried to share some bright content on my screen (like a file, or a web page with a white background) the monitor would be so bright that my whole face would glow like a full moon :). I think next time I will use dark mode for all apps, or reduce the screen brightness to minimum.
> So yes, all things being equal, better lighting helps
$63 on a C920, add one window (typically ships free with your house/apartment) plus access to one truly special light source which although it's 93 million miles away you were lucky enough to get a free lifetime subscription to when you arrived on this planet.
Why spend time worrying about what a cell phone camera can achieve in poor or uneven lighting if a bog-standard webcam can do a really good job if you just fix the damned lighting?
The point of the article is that even a shitty old cell phone handily beats a "mid-range" C920 webcam. Yes, lighting conditions are important to photography, but, as the author demonstrates, cell phone cameras are better even in those scenarios. Furthermore, if you live in northern latitudes you get very little sunlight during the winter so a "window" is not a sufficient solution.
What they're saying is that if you live north enough, you're likely to be working during a time when there _is no sunlight_. A window does nothing when it's dark outside by the time your 4p.m. meeting comes around
My window faces onto a neighbour's house. They have painted this area white.
On my webcam, the window is behind me and it picks up this area outside the window and assumes the picture is overexposed because it doesn't expect to pick up large areas of basically #ffffff, and tries to dim the rest of the picture and darkens everything inside. Too much in fact, so it looks like I'm in a really dark room. The Logitech software does not support adjusting this behaviour, only adjusting the picture after the fact like the brightness sliders in your favourite image editor, so boosting them to compensate for the underexposed feed the webcam is providing the computer just results in a super washed out picture with still awful contrast.
I could rotate my office setup but then that introduces other issues, like glare on computer monitors or poor legroom due to radiator or insufficient space for a desk due to the neighbouring en suite cutout. I could close the curtains on the window and then just rely on electric light.
But these are all solutions worse than the problem of my webcam feed being underexposed, so they're not happening.
I wonder how you can work on your screen: the shining white window behind you should give large reflections that strain and fatigue (my) eyes very quickly.
My rule of thumb at the workplace is no uncovered windows behind me.
Bright matte screens are pretty good at minimising reflections, plus with my current monitor angling the only one directly opposite the window has me between it and the window.
Man, he's just saying webcams are still shit, not that your pontification about light is wrong. Besides, how many of us get to choose the placement of our windows more easily than the camera we use?
It really needn't be. (Proper) photographers have been dealing with these very same issues for longer than we've all been worrying about how we look on Zoom.
Window Light: The Biggest, Bestest Softbox You Already Own:
"Let’s say your windows are west-facing and you want to shoot in the afternoon. And you’re really terrified of hard light. Keep thinking of your window as a softbox. You just need what they refer to in film production as a “silk”. The same material (ripstop nylon) that is found on the front of most softboxes can be placed in front of your window to turn hard, late day light into a glowing, golden, majestic light bath for your subject’s face."[0]
I’m confused why you’re harping on about windows - which cannot be guaranteed in direction, location, size, or number in any house unless you build it yourself. Further, outside of the equatorial region, sunlight is also not guaranteed throughout the day during the year.
Clearly, your solution is neither “easy” nor optimal. Glad that it works for you, but it’s not for everyone.
I think I understand your point, under ideal or almost ideal conditions most webcams can work fine.
However, most people are not graced with the understanding of lighting and how cameras work. There's lots of opportunity to compensate or at least guide people with better hardware and software. The work done with smartphone cameras is clear evidence of this.
Yes, all cameras look good with great natural light. That's not particularly interesting or useful, because we're not always in perfect lighting to compensate for mediocre cameras that've stagnated for 10 years.
>Why spent time worrying about what a cell phone camera can achieve in poor or uneven lighting if a bog-standard webcam can do a really good job if you just fix the damned lighting?
You are just being purposefully an ass. What about on a cloudy day, how does your sun help you then? What about at night? Not to mention most people don't have the luxury to arrange their home to optimize for picture quality on a fucking Zoom call.
Yes, you can help a shitty camera with lights, but we could just have good cameras. You can still fiddle with your lights until heat death of universe, but rest of us just want cameras that work even if we have to pay a little bit more.
Imagine the mind-blowing privilege of not just having a separate home office, but being able to design it to your exact specifications with a window facing north, and telling people to "just add windows". Some people on here are so out of touch.
> I don't think it's that the C920 is particularly brilliant
I used to sit in a cubicle under fluorescent lights, and I often received compliments on my webcam quality when I used a C920e, which I think has the same quality as the C920.
Don't underestimate absolutely horrendous laptop webcams becoming normalized, especially after stream compression.
It blows people’s minds how much a few good practice suggestions change their image. I come from a film background and get requests for help by colleagues and friends constantly. I almost always just reposition their camera and move a lamp nearby to use as a key if they have no window, or as a kicker if they do.
It’s really easy to teach yourself this stuff, it’s not magic. Most people just assume their gear is either too cheap or they’re too ignorant to get it. 5-10min of research and 5-10min of implementation will do wonders for most people.
1) Avoid fluorescent lighting like the plague. You’ll look sickly.
2) Find a nice warm (tungsten/more orange) lamp if your lighting is inadequate. It can be small; it doesn't matter. Place it diagonally from you at eye level.
3) Camera at eye level.
4) If you want to get fancy, get another lamp higher above you (several feet) and place it opposite your main lamp for a backlight. So if your main light is front right, this one is back left. It needs to be weaker than your main light source. This is called an “edge” or a backlight. It gives your hair a soft glow up top that separates you from the background.
The top of your monitor should also be eye level, and slightly angled up, so you never look up, only straight or down. Better for your neck. If you're doing this, then your webcam will also be nearly eye level as well.
Yeah that’s always a factor. If you can’t you can’t, simple as that. It just helps make you look less distracted as well as makes you “more flattering.” Definitely don’t have it angled up if you can avoid it or you look patronizing.
> Find a nice warm (tungsten/more orange) lamp if your lighting is inadequate.
Just buy a high-CRI LED bulb from the hardware store.
It shouldn't necessarily be "warm". In an office only used during daylight, and lit with commercial lighting, a "warm" bulb is going to look extremely orange and out of place.
What matters most is matching color temperatures of your light sources.
Obviously you should find a color temperature that works for your setting, but in general, warm is a safer bet for preserving skin tones unless you are actively fighting daylight bulbs in the room.
Since warmer tones tend to be safer, I recommend it in the absence of more specifics about the environment. But you’re right, it’s not always the best fit.
Also, the vast majority of people using these webcams don't have nearly enough lighting. When there isn't enough, the software cranks up the ISO to ridiculous levels, which gives that distinctive "webcam grain" look that's awful.
And working under insufficient light conditions is also likely worse for your eyes. So people would do themselves a favor working in brighter environments and also wouldn't have to worry as much from their webcam setup.
For me, in bright and evenly lit rooms (so no sunshine but artificial light), most dedicated webcams tend to produce pretty good images.
Also, my experience with the 920 is that it performs well (better than others I have) in low light. None of the samples from the article were low light. Most folks are in dimly lit conditions, and that's where these tend to do better.
Fundamentally, webcams suffer from the same problem as N95 masks: prior to the pandemic, everything was geared towards cheap; it just wasn't a huge market. To invest in retooling would be a risk, and there is no guarantee the market will be there.
People are throwing a thousand dollars at phones with a significantly better profit margin. Webcams will NEVER catch up. He pretty much says this at the end of the article.
Taking an inexpensive used photo camera (of the regular mirrorless handheld type) and attaching it to your computer for video conferencing could be the most affordable solution.
With a North-facing window, where does the camera sit to get the desired result? Asking because I have north-facing windows only and I'm wondering how I could make better use of them.
Umm, it really depends where those lights are relative to the camera.
None of us look our best when lit brightly from directly overhead, or from one side, or -- if you're really after that Dracula look -- from directly underneath.
The sun is a unique (free!) light source, particularly if you're able to get access through a North-facing window. It's quite costly (time and money) to get anywhere close to replicating that effect using artificial lighting.
Exactly. I put a couple of Ikea Torsbo (? or something similar, can't remember the exact model) lamps next to my monitor and they provide nice soft light on the face, which amazes some people. Literally spending 20€ on some cheap tabletop lamps would improve the quality dramatically for most people. I don't even have any fancy cameras, just a Logitech StreamCam I bought in some discount sale.
> I use a Logitech C920. Many people have told me how good my webcam feed is.
What's your secret for overcoming the C920's shitty color reproduction where the auto white balance makes everything extremely blue? I hate mine and am thinking about returning it as defective.
> What's your secret for overcoming the C920's shitty color reproduction where the auto white balance makes everything extremely blue?
You can disable all automatic settings and then you can manually configure things however you want based on your environment. The only issue is every time you reboot it tends to get reset.
My home office faces north and I do look alright on a webcam (T470 and now some kind of Thinkbook equivalent), but the dim light all day gives me SAD. I'm on the lookout for a few 4k lumen LED panels
Indeed, if people cared enough, they'd be able to get amazing results out of any hardware.
I also have a window that faces North, I use the rule of thirds to better place myself in front of the camera, I only do video calls during the day, and I have a cat that can distract people at key moments.
You can use your phone's camera as a webcam with Reincubate's Camo app. This requires a wired connection, IIRC.
You can also use OBS Studio along with a virtual camera plugin to use any device which can output directly to your computer.
But most of these solutions do not work on Safari or FaceTime unless you manually modify the app.
Now, Apple is going to soon introduce their "It Just Works" solution with the next release of macOS and iOS. You will be able to use your iPhone's camera as a webcam wirelessly with your Mac by just sticking your phone on the back (Apple is partnering with Belkin[0] for this stand) [1]
I personally don't care about the camera as much as I care about the sound quality.
Apple introduced a new API in macOS 12.3 to create Core Media IO plug-ins that run out-of-process, this kind of new plug-in works in Safari and FaceTime too (Camo already uses the new API).
I mean, phone cameras will literally always be better than laptop cameras. They're just a lot bigger.
Edit: I should clarify this a little. From my understanding, camera quality is pretty overwhelmingly limited by lens quality. Better lenses require a thicker / deeper camera housing, which is hard to stuff into the top lid of a laptop. Phones are "always" (although that might not be true forever!) going to have more space for bigger and better lenses.
There's very little clearance in that direction, and what clearance there is, is necessary so the screen/lens doesn't touch the keyboard/case and get scratched.
So if you have a bump on the inner side, you also need to have a notch in the topcase.
And unless you move the webcam somewhere weird (an edge, or below the screen), you need to have that notch in, or right below, the trackpad, which is less than optimally comfortable before you even consider that the accumulation of crap in that notch can then damage the camera lens.
Would it be so weird? I mean, the top corner/edge is less of a problem than the bottom (nostrils + big hands).
Asymmetric? Let's go nuts: two webcams, one on each side... suddenly, no notch, double the amount of light, stereoscopic vision. Combine that with some software processing, and two very cheap sensors could probably produce decent results without bumping the price the way a thin and tiny high-quality sensor would.
I have an Apple studio display. That thing is pretty new, thick and has an A something chip running iOS inside.
The camera quality is shit. Just garbage.
I don't understand how they manage to do it. I understand your comment about laptop screens and it makes sense; however, neither third-party external webcams nor even a non-space-constrained Apple webcam performs OK.
If Apple released a notch in the bottom side of their macbooks for a camera bump to fit in, everyone would be calling them visionary geniuses and they would use it to jack up the price 200 dollars more.
They naturally are because there's just not much there, it's just an LCD.
Making them thicker means you're making them heavier and less rigid and full of nothing for 99% of their volume, making the entire laptop less wieldy (as it gets much thicker).
It also means the hinge can't go as far back, as the lid is now in the way (forget laptops which sit open flush). Or you have to design a completely novel (and much more expensive, and fallible) hinge system which better supports a thicker lid, e.g. the Surface Book's fulcrum hinge, except instead of that thicker lid being a computer it's just air, so you get nothing for that expense and inconvenience.
Hm. Interesting: you could put the battery in the screen making it quite a bit thicker, and then have the base thinner, that would change the balance though and it might not be as stable when sitting open on a desk. It'd be great to have a 20 hour life laptop though.
Or we could rearrange cooling so fans stay on the bottom of the machine with heat pipes leading up to behind the screen where the main board is
Or to keep the thermals totally on the bottom just move the GPU and CPU off the main board and put them underneath with the rest of the main board behind the LCD
This allows for a larger battery with more run time, better camera, and it should keep the balance of the machine while keeping hot stuff away from the LCD
Laptop "lids" are a lot thinner than your average phone, they might be able to use a plateau like at the back of the phones to house it but I'd guess if there would be a simple solution that they could "just" do it would've happened already.
I'm not saying it's impossible, I'm saying that it's not a "just" drop-in replacement that you can do from one day to the next. Don't you think they would've done that instead of engineering some ugly phone holder solution for the new Continuity camera?
Apple is always searching for ways to get consumers to buy more of their hardware, so this does seem like a great way of entrenching users a little bit more.
What I didn't think about in my original comment is that their new solution makes use of two of the phone's cameras (the face view and the keyboard view)... I'm not sure consumers would be okay with THAT much hardware real estate being taken up by the addition of a wide-angle lens (which is a really cool feature, albeit a bit of a gimmick for most). Though I'm sure Apple marketing could still make it generally desirable if they wanted to.
Designing a laptop by sacrificing portability for camera quality is absolutely not rational design. No one is walking around taking photos with their macbook.
Its only purpose is the occasional video chat, which, until recently, most people did very rarely.
Sacrificing portability for camera quality in a laptop would very much fall under irrational.
A camera bump could easily be added and with a redesign of the palm rest/touch pad area there could be a recess or slight curve down to fit, you're absolutely right
> I mean, phone cameras will literally always be better than laptop cameras. They're just a lot bigger.
Sony RX0.
The thing with webcams is that nobody wants to spend any substantial amount of money on a third-party USB webcam, and nobody is willing to pay 10 or 20% more for a laptop because it has a quality camera inside.
The 1080p laptop webcam that's in Apple's latest models is quite a decent camera that benefits from the ISP in Apple silicon. Even the ancient 720p camera hooked up to that ISP managed an incredible amount of improvement over the Intel models.
I'd also venture a guess that many other laptops' built-in webcams can outperform some of these dedicated units, not just the ones that are Apple's iPhone R&D beneficiaries.
"Mic audio is always trash" – again, try a newer Mac notebook model. Apple has been investing serious R&D into laptop microphones, and I think competitors are taking notice and making their own improvements.
It might sound crazy to buy a second phone for this kind of thing, but I do think there are people who will get a better value out of actually just buying a second iPhone as a webcam/streaming camera compared to buying a dedicated professional camera.
Unlike a mirrorless/DSLR camera, you can also acquire smartphones with deep carrier discounts and long, zero-interest financing. So, presumably, you could buy yourself a new phone, and instead of trading in your old one you could use that as a streaming webcam.
Don't forget that iPhones aren't just $1000 luxury phones, Apple's cheapest new model is $429 and will stomp all over any dedicated webcam on this list. If you jump on the used market you can buy something like an iPhone 11 or XR for under $300 or $200 and still have a really solid webcam (as a bonus, you don't have to care about the condition of the battery).
External cameras like the Logitech 920 tested in the article have more than enough thickness to have better optics and better electronics than the cameras in phones.
Maybe this is out of ignorance, but I just sort of assumed that's how they keep it so cheap. Also, the cameras on phones are arguably why people buy the phones they do at this point, or at the very least it's a huge consideration given how frequently we use them. They also aren't just for selfies and FaceTime or what have you; they're also meant to be photography tools. They've completely replaced small consumer point-and-shoots. A laptop camera/webcam isn't and can't be that.
Internal cams in laptops are considered commodities by the laptop manufacturers: they need to put any HD webcam in their product or no customer will buy it, but nobody picks a laptop model just because it has an excellent webcam.
Could you please review the site guidelines and stick to them when posting here? You've unfortunately been breaking them quite a bit, such as with snark, flamebait, and ideological battle.
What we're hoping for on this site is curious conversation that remains thoughtful across differences. If you wouldn't mind reviewing https://news.ycombinator.com/newsguidelines.html and taking the intended spirit more to heart, we'd be grateful.
That's a good question. The answer is no, for two reasons:
(1) there's no clear way to distinguish "absurd claims that have no basis in reality" from statements that are simply false, and
(2) there's no clear way to distinguish false statements from true ones. We don't have a truth meter.
In the absence of clear criteria, what would we do? Decide what is true ourselves? Not possible—we're ludicrously unqualified, and the community would never support it. Also, I shudder to think of the karma one would incur.
Another factor is community. It would not serve community to ban people, or kill posts, for being wrong. A true community allows people to be wrong—belonging is about relatedness, not correctness—and a robust community would withstand wrongness the way a robust immune system withstands pathogens. I'm not saying that HN is either a true community or a robust community, yet, but that is the aspiration.
I think he means that in order to fit a camera you need to increase the thickness of the laptop's lid. Whether or not this leads to double the weight, idk, but there is definitely a weight increase to fit the camera.
If they did mean it that way it still doesn't make sense to me. Where is the extra mass coming from? Are they just thickening the whole lid and filling it with solid lead? Mass comes from matter; if there isn't matter, then there isn't mass. Adding a tiny camera, even if they did have to increase the thickness of the entire lid a tiny bit, still wouldn't come close to doubling the mass of the whole thing on an aluminum-body laptop.
It's simple math. To hold the camera housing of an iPhone camera the thickness has to go up which means the entire top lid needs to become thicker. This will easily double the weight of the laptop.
> Now that I’ve used the webcam for myself, I can say that it’s woefully impractical. In order to actually have it looking at my face, I have to tilt the entire laptop up due to the fixed angle of the camera. It’s a cool idea, but probably shouldn’t have made its way to an actual retail product. People are likely to be super frustrated trying to use this webcam
Only if that means using twice as much metal - I would expect instead it'd mean adding a thin strip of metal around the circumference of the lid. That would add a few grams, I'm sure, but hardly doubling.
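To put rough (assumed) numbers on it: a 14-inch lid has a perimeter of about 1 m, so a 1 mm-thick aluminium rim that deepens the lid by 4 mm adds roughly 100 cm × 0.4 cm × 0.1 cm = 4 cm³ of metal. At aluminium's 2.7 g/cm³ that's ~11 g -- around 1% of a 1.4 kg laptop, nowhere near double.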
Webcams are good enough for their purpose which is video conferencing and calls. In the end your video is likely going to be encoded in like 360p low bitrate. I personally prefer not being in razor sharp detail on calls anyway.
+1 to that. I don't want to be in ultra HD in meetings. I mean, I also don't want to look like a character from a 90s CD-ROM game -- but that's a matter of bandwidth, not the camera.
In a way, 720p or below is a non-vain version of applying a filter, smoothing imperfections away.
I don't think this is actually true. We stream high-definition video regularly nowadays. It's not a question of resolution, but of other qualities of the recording, the way differences in lighting are handled, etc. Those imperfections are not smoothed away by compression.
To your point, I've seen some modern webcams accentuate extremely minor facial blemishes to the point where observers actually notice and call it out, concerned that the person is sick or has had some sort of shaving incident. I would think compression would make that even worse.
Mine does this. It makes me look drunk. People who met me in real life after lockdown were surprised I looked normal. I keep meaning to find some software that will turn the red down.
I don’t know how much time you spend on zoom calls but if both sides have genuinely good connections and you have a good camera and lighting, you can get fairly high def video across. And it makes a big difference, especially if you’re 1-1 and want to make a proper connection and read their expression.
Big meetings are the most worthless meetings anyway. It's 1:1s or three person meetings I find most productive, and where I care more about video quality.
> good enough for their purpose which is video conferencing
As a hard of hearing person, the main reason I'd bother to video conference is so I can see a person's lips to understand them better. The vast majority of practical set-ups do not presently make that possible. Some are so bad with the compression and framerate that sign language gets substantially garbled, too.
You probably know of this already, but in some software you can have speech to text live subtitles. I hate to advertise for Google but their Meet thing is the only one where I know this exists. The quality is slightly worse than their YouTube auto-subtitles (and even those are worse than my own hearing as a non-native listener), though perhaps that's because the input is also worse and it needs to be realtime? Not sure, but if you're having trouble understanding, getting 80% as text plus your own hearing and lip reading might be a big aid.
Microsoft Teams has a live transcription feature too; unfortunately, if you don't speak in a native or near-native US or UK English accent, the results are quite bad.
I'd remove 'UK English' from that at least - at my previous company we joked about how completely unusable it was: almost everything was transcribed as 'Samoa'; it also liked 'certainly Christmas' and other odd phrases that certainly weren't being said.
That was actually an assumption on my part. Most people in our team are not native English speakers and it does a really bad job. Transcription text at the end of the meeting is pure comedy.
There honestly didn't seem to be any such rhyme or reason to it - it was way too frequent. Often replacing a whole sentence. (Not that those either side would be recognisable enough to see that it was a sentence between them being replaced.)
The first time we saw it there was a good lot of 'what the hell is this' before we even realised it was supposed to be a transcript, it was that bad. I'm not talking about just the odd bits replaced with weird words/phrases - the whole thing would be utter nonsense.
(And this was in Cambridge, not to say everybody was local (I'm not), but still, predominantly 'standard' not-straying-too-far-from-RP accents, it's not like it was some difficult niche regional accent & dialect that it couldn't understand.)
We use it as a joke internally. It requires plain English, spoken slowly with a generic US accent (even fake) - anything else is proper bad. To be fair it has improved nontrivially, though.
As others have mentioned, Microsoft Teams also has an auto-generated Live Captions feature. It supports creating live captions for a number of spoken languages [0].
Zoom also has the feature, they call it Live Transcription [1].
While such features can be helpful, using them is rather different than lip reading, particularly while you're in a conversation.
Full ack on that last part. I got a new laptop from my employer a few months ago and the angle is so wide, I have trouble not having something like a drying rack or other unprofessional stuff in the background. It's still on my to-do list to look for a way to crop the image, and manual quality control would also be a welcome feature.
That's not to say that I don't see the use-case for good quality. The "360p ought to be enough for anyone", as you say, really is fairly crappy, and if you're doing some announcement for an audience you care about, where you're in full view on everyone's screen, it would be nice to have a bit better quality than that.
(Or for science/hacking: I use my phone camera for a ton of things from capturing the night sky to the ~1000 fps slow motion feature. If webcams had similar "gimmicks", I'd probably make use of it, also because webcams are connected to a machine where coding is a lot easier than on a phone, so I can more easily do something meaningful with the image stream. But I realize I'm the outlier here.)
> It's still on my to-do list to look for a way to crop the image, and manual quality control would also be a welcome feature.
OBS with virtual camera. Add the camera as a source, scale and crop to your heart's content, click 'start virtual camera', then launch your videoconferencing software and choose the virtual camera.
Thanks! I already glanced at my options and this one seemed like overkill, but yeah it does sound like the go-to solution that everyone uses. Will be installing this some time soon, thanks for confirming this is the way to go :) (Also to the sibling comment)
On Linux you may be able to set it up with ffmpeg and v4l2loopback (create a v4l2loopback device and have ffmpeg grab your input, crop it, and output to that device).
Maybe less overkill, but probably more involved to set up though so OBS is probably your best choice unless it is really too heavy.
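For reference, a minimal sketch of that pipeline in Python -- the device paths and crop geometry are assumptions, and the loopback device must already exist (e.g. via sudo modprobe v4l2loopback video_nr=10):

    #!/usr/bin/env python3
    # Feed a cropped view of a physical webcam into a v4l2loopback device,
    # which conferencing apps then see as a normal camera.
    import subprocess

    SOURCE = "/dev/video0"   # physical webcam (assumed path)
    SINK = "/dev/video10"    # v4l2loopback device (assumed path)

    # Grab the webcam, cut a 960x720 window starting 160 px from the left
    # edge, and write the result to the loopback device. Runs until killed.
    subprocess.run([
        "ffmpeg",
        "-f", "v4l2", "-i", SOURCE,
        "-vf", "crop=960:720:160:0",
        "-f", "v4l2", SINK,
    ], check=True)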
You could use OBS studio to crop the image. It takes your webcam as an input and creates a virtual cam that works with every conferencing software I've tried. (It can also use your phone camera with the aid of the DroidCam OBS app.)
Yeah I kind of agree. We're years away from when most people's videoconferencing quality is close to C920 quality. Audio is much more important. The C920's is pretty crap. You can easily buy better mics, but unfortunately most people just don't care about the quality of their own output.
I think part of the problem is that it's quite difficult to know your audio/video sounds/looks crap to other people. For most users they won't have a clue until someone says "I can't hear you" lots of times.
Zoom doesn't even have a way to test your connection.
Zoom is the connection test itself. The way it handles a choppy link -- slowing down then speeding up to mask the issues, prioritizing audio over video, etc. -- just amazes me. It's one of the best videoconferencing programs I've used since the 1990s, and it works in really shitty conditions (remote LTE, HSDPA).
Well, Zoom dynamically adapts to whatever connection pipe you have; you get "Your internet connection is unstable" when the net is really, really bad -- several-second pings, packet loss, etc. If you see it too often, then the test has failed :)
But compression looks so much better when it has decent source material to work with. It's the crappy noisy stuff it screws up on because it views the noise as detail and tries to preserve it, at the expense of details you do actually care about.
This is why movies still look really good even at surprisingly low bitrates (well one of the reasons, the other is of course unlimited time and lots of budget for tuning the compression which realtime applications don't have)
If you tweak some zoom settings you can get a slightly-less-shitty bitrate, and people constantly comment on the quality of my MILC-based setup (APS-C camera with a cheapo chinese prime f/1.2 lens). This is vs the $200-ish logitech external webcam that is standard issue at my company.
Agreed. I set up a really nice webcam by using a Mirrorless Full Frame camera and my wife refuses to use it when she sits at my desk. She doesn't want her coworkers to see her with that much detail and clarity.
By the way, just using a Sony or Panasonic mirrorless is fine, you don't need to buy a custom high-end webcam, and you can use those for taking pictures and videos when you're out on the weekend.
At the end of the day, you'll mostly end up as a glorified thumbnail in a small corner of a, let's be realistic, 1080p screen, next to a bunch of other people. No one in the call gains anything by watching each other, it doesn't matter. For this case, just get a good microphone (even a 30€ Behringer sounds much better than whatever it is most people use).
Another +1. I got an HP w100 (480p, 30 FPS) for this purpose. It serves my purpose best, which is interviews and screen shares with friends. I can be seen and recognized, and I save on bandwidth.
Image quality is top notch, and the ability to zoom in/out using the zoom lens is nice. You also get some nice background blur, depending on the lens used. For power, there are modified "batteries" allowing you to plug the camera into AC.
That's a good idea actually, especially older DSLRs are quite cheap on the used market.
The problem is I'm more of a Nikon guy and I don't own any Canon lenses. But my Nikon body is an ancient D50 which doesn't have any of this fancy webcammy stuff. Or even video recording for that matter. And I don't know if newer Nikons have webcam features. But thanks for the tip, I'll investigate.
Even that cam would provide excellent images though with its 5mpx sensor.
Nikon[1] and Sony[2] also released software during the pandemic that turns any of their video recording DSLR/MILCs into webcams. I think a D50 might be a bit too old though since it doesn't have video recording.
Thanks, I'll have a look. But yeah I'll need a newer cam for sure.
The problem is that most of my lenses are 'screwdriver' models so if I upgrade I need to take one of the higher end models which can still drive this. This feature was removed on all the lower end ones over time.
If you have, like, 1960s Ai Nikkor 50mm F1.4 with a "crab nail", and if you don't need autofocus or program-auto aperture, they can be put onto basically any cameras by means of a passive adapter.
I have had no heating problems whatsoever. Mostly its been in use for 1-2 hour sessions at a time, occasionally longer. I do turn the camera off in between though, so it's not turned on 24/7.
Most annoying is a bug in the Canon webcam driver: if the camera is switched off, it displays a static image, which somehow fully occupies a single core on my machine. I would have thought encoding a still image into a video stream should be doable with fewer CPU cycles.
This is a confusion of terms. By "recording" people can mean both writing data to the card and activating the sensor to receive information. The latter is what overheats a camera, and the process is the same for both sending data directly to HDMI or recording it to a card. Some older sensors on Canon digital SLRs would actually begin to burn themselves out just by having the screen display what the sensor was receiving (also known as "live view") for more than about 10 minutes.
The lack of AF is a feature, not a bug. I replaced my AF lens with eye AF with a manual focus lens on my work webcam. The DoF zone indicates to other viewers if I should be conceptually "in focus" at that point in the meeting (i.e. I am speaking or directly engaging with the speaker). Otherwise I recline and go slightly out of focus.
I had a Logitech C920 that died a couple years ago and when I went to find a better replacement I ended up buying a... Logitech C920. I couldn't find anything that seemed significantly better that wasn't a lot more money. I regret the purchase, but I'm glad I didn't just spend 3 times as much on a still-not-that-great webcam. I was thinking of finding a second-hand digital camera on eBay, but maybe instead I'll see if a friend has an old iPhone.
Though you'd think with the WFH revolution there would be at least one company out there making a high-quality purpose-built webcam.
Yeah it’s wild how much of a vacuum exists in this market and how bad it is.
I have the same webcam. I also bought the Logitech MeetUp for a larger room and it sucks -- it especially sucks considering the price.
The Opal is a weak attempt to improve this, but includes a mic I don't want (lots of decent separate mics exist). The mirrorless setup looks great, but with the latency, capture cards, overheating, external battery packs, required camera arms, etc., it's just a huge pain.
I wish apple would just make a good one again since nobody else seems capable of it.
Logitech dominates the market, and it dominates it by marketing spend, not on making a better product, since they know that the average consumer will never know the difference.
I bought a C920 on the basis of many shilled reviews, and I kneel to the power of their marketing department every time I turn that piece of junk on.
I use the bad Mac onboard for almost all work conferencing because it's right there and doesn't require me to occupy a port on the laptop for in the end just different bad video quality.
I guess I don't understand the mechanism whereby you use an Android phone as an external webcam. Guess I never tried just plugging it in via USB and seeing if my OS picked it up...
There are quite a few apps that let you easily do that. I use DroidCam, which lets you do it over your local network as well, reducing the cables running to your PC.
I wish webcam manufacturers would focus on sensors and optics instead of rubbing lipstick all over low quality hardware with an undeniable porcine quality.
Fixing it with AI is nonsense. Fashionable nonsense but still nonsense.
I have spent lots of money on various webcams. Including cameras that promise “AI”. The truth is not even my $1000 “AI” webcam gets close to my aging iPhone 7. As a video conferencing camera it is an over-priced, over hyped-piece of junk. The damn thing can’t even do basic white balance and has no control to set it because “it is an AI camera”. I either look like I’m about to die or as if I have a sunburn. And after countless firmware upgrades, it is still horrible. The expensive camera has one thing going for it though: it has decent microphones. That’s actually more important than good video.
To anyone who makes AI webcams: please give me manual controls and focus on image quality instead of marketing. Don’t be clever. Hire someone who obsesses over image quality and is a decent photographer to judge the visuals. And put a good microphone system in the thing.
Totally with you on the microphones. I use a really decent mic for video calls now, and everyone says I sound like an NPR announcer. This is a good thing. It's really draining to try and listen to people who have awful sound quality when they're talking. I wish everyone would invest in a decent mic. Just a basic podcast mic is a million times better than using your earpods or something similar.
I upgraded my video and audio setup during the pandemic since I figured even at up to $1k it was cheaper than the office space it was replacing.
After about a month of research I was surprised to find how difficult it was to evaluate options and how few of those options were plug-and-play.
I had assumed the explosion in livestreaming over the last few years would mean best practices would be easy to find and great high quality cameras purpose built for streaming would exist, but didn't find that to be the case at all.
Ultimately I took a chance on a heavily discounted open box ZV1 but it wasn't until I married it with a 4k capture card (USB streaming was meh), a key light (lighting matters so much!), a mic (& arm to hold it), AND an ali-express battery-to-dc connector (so it could run all day) that I finally hit the sweet spot of "clearly better" and "easy to use all day".
This is my exact experience, except with an A6000 that I already owned. I wouldn’t recommend it for most people since like you said there’s a lot of accessories required to get a good result, but for people like me who enjoy iterating on their home office setup it’s a fun project.
My favorite feature of DSLR/MILC webcams is the depth of field effect. It’s pretty subtle, but adds an air of professionalism vs the default zoom background blur feature.
I’ll probably get key lights at some point but in the mean time I’m using regular floor lamps with Hue bulbs, which does let me do things like play with the light color. I’ve found a very slight purple hue in the background really makes the foreground pop.
The Logitech C920 can put out incredible video if you have access to every setting. For some reason Logitech puts the training wheels on and doesn't let you adjust every single setting, like gain. There was an incredible app called "Webcam Settings" for Mac that let you refine that webcam like putting a DSLR into manual mode. I was able to get it so dialed in you'd think it was a top-of-the-line camera. That software is unfortunately gone, but sometimes it's not the hardware, it's the software. That said, I refuse to pay monthly or yearly for software to use my phone as a webcam.
On Linux you can use v4l2-ctl to fine-tune the webcam settings (focus, gain, white balance, etc.).
The C920 has really poor firmware, none of the auto settings work well. Auto focus is particularly bad.
I wrote a quick script which hard locks all of the settings of my C920 to the optimal values for my office conditions.
The low-light performance of the C920 hardware is quite bad, but if you have adequate lighting and manually configure the settings, you can get a pretty good setup.
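For anyone curious, such a script is essentially a loop of v4l2-ctl calls. A minimal sketch in Python -- the control names vary by kernel version and the values here are placeholders, not my tuned numbers (run v4l2-ctl -d /dev/video0 --list-ctrls to see what your unit exposes):

    #!/usr/bin/env python3
    # Hard-lock a C920's settings by disabling every auto mode and pinning
    # manual values. Control names and values below are assumptions.
    import subprocess

    DEVICE = "/dev/video0"  # assumed device path

    SETTINGS = {
        "focus_auto": 0,                      # disable autofocus...
        "focus_absolute": 0,                  # ...and pin focus at infinity
        "exposure_auto": 1,                   # 1 = manual mode on UVC cameras
        "exposure_absolute": 250,             # tune for your room's lighting
        "white_balance_temperature_auto": 0,  # disable auto white balance
        "white_balance_temperature": 4600,    # pin colour temperature (K)
        "gain": 64,
    }

    for ctrl, value in SETTINGS.items():
        # check=False so a renamed control on newer kernels doesn't abort the rest
        subprocess.run(
            ["v4l2-ctl", "-d", DEVICE, f"--set-ctrl={ctrl}={value}"],
            check=False,
        )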
There is an industry that packages high-quality Sony image sensors in housings with C or CS mount lens compatibility and native UVC support on Linux/macOS/Windows. Primary uses seem to be industrial cameras for manufacturing quality inspections, circuit magnification for repairs, installation onto telescopes, etc.
Prices range from $40 to multiple hundreds for extremely high quality sensors.
This sketchy site has comparisons of the sensors by area and signal to noise ratio.
> Using an iPhone as a webcam is easily the best option without dropping an excess of $1,000 for professional camera gear. Using the iPhone you already own, or a recent hand-me-down, a moderate investment in lighting, and Reincubate Camo, you can get excellent results and none of the frustration and hassle that accompanies standalone webcams.
I don't have an iPhone. So if I want to buy one that is better than any of the webcams reviewed, I'm spending a couple hundred bucks at least. Then I need to spend $50/year or so on their Camo software.
For a couple hundred bucks I can get a used Sony Alpha, OBS is free. And arguably with effort I can make the Sony look much better.
This is a stretch, of course. The average person isn't going to fuck around with OBS.
Let's think of it another way. For <$100 you can have a webcam that works. It's better than nothing. Is an iPhone plus TFA's software better? Yes. Is it better proportional to the cost? Absolutely not. I'm not even convinced it's twice as good. I have some colleagues who use Camo, happily, but as a viewer in a Zoom meeting, I don't care. They're usually the size of a postage stamp on my screen. Most folks can't even upload full resolution streams of their cheap webcams in real-time.
Camo is cool, it's great that you can use your phone as a webcam, I love it, but "Why $90 webcams aren't as good as a $1000+ iPhone" would be a better title.
For what it's worth, with the new macOS 13 and iOS 16 releases, you don't need Camo as it's built into the OS.
Of course the rest of your points still stand. The article does hinge on the user having an iPhone, and the comparison isn't entirely fair.
However I think the point of the article is to say that many people have devices on hand that will give better results than splurging on webcams which won't provide much gain
The simplicity of a normal webcam also has a huge value.
I hate waiting a few extra minutes at the beginning of a meeting while someone tries to get their complex webcam setup working. Some people can plan ahead and make it all work. Some people are always fighting with audio settings or restarting OBS or adjusting their webcam and so on. Wasted time for everyone involved.
I’m not looking forward to the new era of people messing with their iPhone mounts to their laptops to get the camera juuust right as we start every meeting.
Let’s just use the built-in webcams and get on with our meetings. Or if you want to use a fancy setup, you must have it all ready and tested before the meeting starts.
> I don't have an iPhone. So if I want to buy one that is better than any of the webcams reviewed, I'm spending a couple hundred bucks at least.
To be fair, he reviewed an iPhone so he recommended that. I think the advice is really more like "use the existing high powered camera you have in your pocket".
Unless someone is living under a rock (at least anyone in the use-phone-as-webcam territory), they probably have an Android phone if they don't have an iPhone.
Any decent Android phone should work too. I'm sure there are many apps enabling this. While they probably have worse image quality than iPhone, they'd definitely be much better than those cheapo webcams.
OBS's virtual camera won't work in many macOS programs, nor will GoPro Webcam. It should be noted that the M1 MacBook Pro has a camera which is noticeably better than it used to be.
With gopro, it depends on the model and how you connect it.
I did use a GoPro (an older one, a Hero3) as a webcam, and it did work, to a point. The Hero3 has HDMI output (not clean, but the overlay disappears after a few seconds), so connecting it via an HDMI capture dongle made it work as a standard USB webcam in macOS. The picture was great, and the latency was under 10 frames when doing 60 fps capture. The problem was that it could not charge and record at the same time, and it didn't give any signs of life when the battery was empty (other than signalling the charging). You could not even leave it powered so it would charge when empty -- it would only notice that it was empty and start charging when you pressed some button. It was annoying to make sure the battery was charged enough when needed, so I just got a Logitech webcam; the picture is worse, but it just works, all the time, without babysitting.
With hero3 it is not a problem, see the very first sentence.
It does not stream over USB. There's no app to simulate a webcam on the host operating system. The Hero3 outputs HDMI; you use some HDMI-to-USB dongle, which is a plain old USB UVC class device, and every operating system out there with USB support knows how to handle those. So if a Logitech or whatever webcam works, this one will work too.
However, I see that hero9 has no hdmi out. That's the bummer.
A7 is definitely a better pick but I use a A6000 as my webcam and haven’t had any issues with overheating.
My main annoyance with the A6000 is that it can't charge and film at the same time, so I had to buy a dummy-battery power adapter, which works but looks janky.
Which comment are you referring to? I can't find it. Whoever it is is wrong though; fuji stuff works better than anything else I've tried, except for dedicated cine cameras.
Marshall, a company that makes miniature video cameras used in broadcast has a great USB camera that is UVC1.5 compliant, you can change lenses, it's not that big, and has great image quality.
Honestly, if I were to buy another webcam beyond the cheapest "yeah, that'll do" one I could find, the main thing I'd be looking for is microphone quality and features. Sure, I could and have bought a USB mic, but a webcam is pointed directly at you (like a shotgun mic, which seems ideal for this kind of thing), whereas the bulk of a USB mic can be harder to position in a way that doesn't block your forward monitor. If it had a configurable mute shortcut, with a little light on it to show its mute status, I'd buy one in a second.
In fact, having said all that, I don't think what I want is a webcam; I want a webcam-mounted USB microphone. I do remember seeing one once, but it's very old now and has none of that mute functionality. Wish someone would make one again.
I don't _want_ a good webcam. There's really no need for my colleagues to be able to count freckles on my nose. All I want others to see is my general contour and expression on my face. In fact I wouldn't use it at all if it weren't mandated by company policy and no one used one until after HR drones started complaining about it. We had one guy join the weekly call with some crystal-clear, HD cam once and entire office started joking about his alleged "online side gig".
Lighting and audio are more important than raw image quality.
Getting an expensive camera to send heavily compressed 320p low bandwidth video doesn’t really improve a whole lot.
Honestly the #1 improvement most people need to work on is speaking clearly and loud enough. #2 would be having literally any light source in front of them rather than being purely backlit.
> Lighting and audio are more important than raw image quality.
Exactly! A decent USB microphone costs something like $100 and will improve your calls more than anything. I can tolerate if your image is a bit dark and blurry, but it's super hard to pay attention if you sound like you're calling from a two square metre tiled bathroom through a 1950s telephone system.
The selection process doesn't have to be something super complicated that satisfies every audiophile. Just google what streamers on YouTube and Twitch use and choose something that fits your budget. For example, the Blue Yeti and Audio-Technica AT2020USB are popular, relatively cheap, and super simple to use.
And as mentioned the best thing you can do to image quality is just to improve your lighting.
Just FTR, for the active speaker Zoom streams heavily compressed 720p, which goes up to 1080p if HD is enabled. It's low bitrate but still makes a difference in clarity.
This is one of those markets that clearly doesn't appreciate picture quality enough. I have at least 6 different webcams and have had the same experience as the author: they are all crappy in slightly different ways.
It seems the point-and-shoot camera makers have all the parts and technology available to their teams to make a really nice pro-sumer web cam and given they are looking for adjacent markets I'm kind of surprised there aren't any out there. They all seem to have the equivalent of 'camo' (aka software that turns the camera into a web cam) but none of them seem to have packaged a camera specifically for this niche.
One criticism I haven't seen brought up is that while UVC drivers are plug and play on Windows their settings are not persistent across reboots [1]. I would painstakingly manually tune the exposure and white balance of my Brio and get it to be less than terrible and then windows update would reboot over night and I'd fire up a stream in the morning looking purple. I have softboxes and high-CRI lights too so this was particularly enraging.
The flipside is that non-UVC drivers are less terrible but less compatible: I've switched to an Avermedia PW513, which uses its own drivers -- this works great for OBS when I stream, but nothing that uses UVC works without an OBS Virtual Cam. Just don't run Process Monitor and watch what their software is doing to your registry every second.
So the driver system, at least on windows, is part of the problem IMO.
[1] After I switched to non-UVC I learned about a UVC driver restoring utility but I've never tried it.
I still have my iSight camera plugged into my Mac. 'Tis a shame it won't work when I upgrade to an M-series Mac, even though I don't even use the camera for anything.
It definitely does have some overheating issues, but it looks too good. I might use the shell and replace the guts with a Raspberry Pi powered camera [0] if I can actually get my hands on a Pi Zero with the shortage we have right now
> If I could I'd still be using my 480p external 2003 iSight camera through an increasingly ridiculous series of dongles
Why? 480p does not sound better than today's mediocre webcams, does it have some amazing light balance or face detection for focus or something?
I would want to do this as well for shits and giggles, a 2003 webcam with 4 dongles being nearly on par with modern webcams is fun / hackery / a conversation starter, but I'm curious if it's more than just that. (My own ~2005 webcam simply connects with USB and works out of the box on any modern hardware running Linux. I suppose I didn't and don't think different enough.)
Great features too in the magnetic monitor attachment and the fancy aperture to close and turn off the camera.
The low resolution is somewhat appealing, in a retro way. If I'm going to be using a crappy external webcam it's cool that the image looks like an early 2000s PowerBook.
The FireWire iSight camera's image looks like ass but it's good enough for my potato face.
The camera's bundled FireWire 400 cable connects to a hard drive enclosure that also has a FireWire 800 port (the camera's cable is detachable so this could be replaced with a FireWire 400 to 800 cable). The FireWire 800 cable runs to a Thunderbolt 2 dock with a FireWire port and the dock's Thunderbolt 2 cable runs to a Thunderbolt 3 to 2 adapter connected to an Intel MacBook Pro.
I like the built-in iris-style camera cover, though by twisting it open/closed I often shift the position slightly. The camera's stand sits on a box wedged between the wall and my monitor; I wish I had a slightly smaller box so the camera sat a little lower, closer to the top of the monitor.
I had all these parts so it was a zero dollar way to get a camera above my external monitor. I might try the free version of EpocCam with an old iPhone (too old to work with Apple's forthcoming Continuity Camera feature) but that's bound to be more fussy to start/stop.
For a REALLY good webcam just get a Pi Zero (v1), Pi HQ camera, a good C-mount lens, and flash this firmware: https://piwebcam.github.io/
which makes the Pi Zero appear as a regular USB webcam when plugged in, and also gives you a telnet thing to control camera parameters.
It's particularly nice because it doesn't mount any filesystems in RW mode, so you can just plug and unplug it as needed.
There are also some enclosures you can 3D print for this combination.
Although this was a fascinating article, the quality of webcams is pretty low on my list of things that could use improvement in the teleconferencing experience.
Not good enough for what? All the problems the author describes are certainly issues I'd worry about in photography, but that's not the purpose of a webcam.
The quality of images produced by modern webcams rivals that of professional gear from 20 years ago. I'd say that's more than "good enough."
And if you really want professional photography quality images from your webcam, it is possible to use a DSLR as a webcam.
There are good USB cameras, just not at the price point reviewed here. For personal use (sitting in front of a normal display at a desk), I strongly recommend and use https://www.huddly.com/conference-cameras/one/.
It's a device you likely use every day. Spend the money, get set up once, and move on.
> Originally, Logitech's higher-end webcams, such as the C920, also included dedicated MPEG processing hardware to encode the video signal, but removed it at some point
Does anyone know if this is detectable on the hardware side? By checking the revision number or features using v4l2-ctl, perhaps?
Well, I have a Logitech C925e, which has MPEG encoding hardware, but as people have switched to WebRTC in browsers, it's unlikely to ever get supported in software (browsers).
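One way to probe this from the host, at least on Linux: list the pixel formats the camera advertises over UVC. Cameras with onboard encoding typically offer MJPG (and, on the older C920 revisions, H264) alongside raw YUYV. A sketch, assuming the camera is /dev/video0 — this only shows what the firmware advertises, which is the best proxy I know of for the encoding hardware:

    import subprocess

    out = subprocess.run(
        ["v4l2-ctl", "-d", "/dev/video0", "--list-formats-ext"],
        capture_output=True, text=True, check=True,
    ).stdout

    for fmt in ("YUYV", "MJPG", "H264"):
        print(f"{fmt}: {'advertised' if fmt in out else 'not advertised'}")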
1. If you absolutely need the best quality image you can have on a video call, use a mirrorless or DSLR photo camera that has 'clean HDMI out' and couple it with an HDMI capture device (which are now quite cheap for 1080p capture).
2. Most (if not all) video conferencing software applies heavy compression to the image of your video feed. Even if your local preview image shows you in ultra-sharp focus with fine detail, you can almost guarantee that people at the other end are seeing a soft fuzzy picture that looks like it's been upscaled from 240p.
So you have to ask yourself, is my laptop webcam, or the £50 no-name USB camera 'good enough'? - in most cases, the answer is probably 'yes'.
I am hoping that Apple's Continuity Camera "just works" as the camera and lighting are excellent. I was on the hunt for a decent webcam last year and surprised this didn't "just work" already without flakey 3rd party apps, etc.
Look, my work laptop isn't shit; the hardware specs are actually pretty good, much better than my personal laptop.
Yet for some reason my personal laptop has a great camera: the image is clear and it adjusts well to lighting.
My work laptop's camera, on the other hand, is utter garbage. It's grainy as hell and seemingly at random it radically changes the exposure, taking me from a vampire in the shadows to a divine, blinding glow. I have an ISDN H.320 videophone that captures in 480i and still looks better, though its camera is much bigger than a tiny webcam module and lets me adjust settings manually, which for some reason isn't an option on many laptop webcams.
I first saw this article years ago, and I'm glad to see it has been updated with some more modern webcams.
I'm also sad to see that the state of the art in this space hasn't changed much for webcams, especially on the Mac.
I do have an older iPhone 7 that I could put to use like this, but I'm starting to become convinced that maybe I actually do need to buy a somewhat newer iPhone device with multiple lenses, and use that instead -- for the same reason that my iPhone 7 is now obsolete for personal use and has since been replaced by an iPhone 13 Pro Max.
As a longtime, happy user of an iPhone 6S (on my third battery, second screen, and second charging port), I am curious as to why you consider the iPhone 7 to be obsolete for personal use. I know that later models have better screens, cameras, processors, etc., but I have a crisp Retina display and all functionality I use seems fine.
The entire reason why I dumped my iPhone 7 is that it can't keep up with the current version of iOS, and I needed something that could. So I'd always be stuck on a device with a back-rev version of iOS, and after a short while security patches would stop being backported.
Moreover, my employer requires that I use a device that is running the latest version of iOS and is fully patched, so the iPhone 7 is now simply a non-starter.
As for using it as a webcam, the camera and other hardware is old enough that I don't think I'd like it for that use, either.
So, it'll sit in a closet or on a shelf until I decide what to do with it, along with all the other old iPhones I have, going all the way back to the original iPhone.
This article was written from the perspective of someone who has not kept up with the webcam market at all and judging by the current state of the comments here, neither has anyone else. Products like the Dell WB7022 and the Elgato Facecam blow the previous generation out of the water, and the Dell is even microphone-free, which I much prefer to the compromised microphones jammed into most cheap models. The Dell model is also bog-standard UVC compliant, unless you want to play with face-tracking or HDR modes; these for some reason require a Windows program to toggle, instead of being exposed as UVC settings. The Elgato model is entirely proprietary.
As much as I like the WB7022, it turns out that nobody notices or cares if I hop on a meeting on my wife's computer with her Logitech c922; at the end of the day and for the majority of business needs the cost/quality cusp has been nailed for about a decade, and I don't predict a major shift toward higher quality any time soon. If you need to produce high-quality video, a webcam will never be enough, and if you don't need to shoot a movie, webcams have been fine for a long time. Even in the linked article, all the criticism is from the owner of the webcams, with strong opinions about photography. Not one iota of attention or interest has been paid to whether anyone on the other end of the meeting even notices.
I have an odd webcam given by our company. When I plugged it in, I thought something was badly broken - it was just a giant blur. Messing around with the thing, I noticed my hand came in focus. Turns out that I had to manually adjust a ring around the camera to focus. Once I did that, the quality is crisp and clear, and I’m not moving closer and farther away from my desk, so I can see how autofocus isn’t needed. I think the simplicity of the design may actually be helping the quality in this case.
Hm, I wonder what use cases the article's author has in mind for webcams... Personally I consider them useless 99% of the time, in the sense that there's no need for video at all. When video is needed, 480p is enough, and the occasional dropped frame or artifact doesn't matter much. If we really need high resolution, that's beyond webcams anyway: YouTube, TV, etc. If we need to share paper documents (interactive whiteboard-style), it's better to scan them beforehand.
Yeah, I always wonder: why don't they put the camera modules they put in phones into a webcam? Sure, they're expensive, but so are some webcams. And $200-300 buys you a great compact photo camera. So why not a webcam, which needs fewer components?
It's not even just the sensors; the optics totally suck as well. Even the top webcams often have a huge 'fake lens' that's actually just flat glass, and behind it is just the usual tiny lens. The Logitech StreamCam does this, for example. It looks impressive but it's still the same crap in prettier plastic.
I kinda had to laugh at that odd iPhone clip they introduced at Apple's last keynote. It's basically an admission that their webcams suck.
I know some colleagues who use their phones as cams, and while the quality is great, it's also super annoying not being able to use your phone when a text comes in or something. I know Apple has some built-in tricks to handle that stuff on a Mac, but for work I don't always use one (and I don't have an iPhone). So I never use my phone for this. I'd spend 200-300 bucks on a great webcam though. It's just that right now there aren't any.
A substantial advantage to Reincubate's product is the ability to control the camera settings that would otherwise be relegated to some auto-focus auto-lighting software. In a dark room, for example, one can choose to slow the shutter speed way down, trading some motion blur for a better static image. I hope they can continue to provide those features on top of Continuity Camera after it launches with Ventura.
This is something I've struggled with for a while, working remotely for years now and not wanting to be depersoned by my team never seeing my face.
I tried a Logitech C920, which I thought was pricey for a webcam, but the image quality is terrible. And every time it starts up, it runs in 60Hz anti-flicker mode, which flickers like crazy here in Australia where we run 50Hz power, so I have to start their awful software to change the setting.
Eventually I returned it and got a Canon M50. Image quality is good now. I saw some people on the internet saying it could be powered through the USB connection, but that turned out to be false, so I have to change the battery every few days, which is frustrating. I know you can get a pass-through 'battery' that connects to a mains adapter; I've yet to try that. Also frustrating is having to reach up and turn it on before a meeting and off at the end.
The next macOS has a feature to use an old iPhone as a camera. If it works seamlessly (e.g. I can mount the phone behind my monitor and it charges itself and turns on and off automatically), it'll be even better.
The webcam market is a niche market. Even game streamers, arguably the most "consumer" like target audience that cares about quality, will often put the image of their face in a corner of their screen.
The simple fact of the matter is that the target for webcams seems to be either 'good enough' or 'better than the competition'. Neither of those is a very high bar.
There are some improvements in webcam land. There's a trend towards higher-quality consumer webcams, even if they're ridiculously expensive for what they deliver.
One problem I have noticed with using phones as webcams is that often the image will look distorted if you're not right in the center of the camera. The closer you go towards the edges, the more your face will get distorted in width or in height. This is a natural consequence of how these tiny lenses are able to get such a wide picture so I can't really fault phone manufacturers for this, but it's something to keep in mind when you pick the more obvious solution and just stick your phone to your monitor.
I have bought Elgato Facecams for our offices. The image quality is on a different level compared to the previous webcams I had my hands on. The sharpening effect was a bit too much, but their software lets you fix it and save the profile to the camera, so you only have to do it once. Otherwise the picture quality is amazing and usable even in badly lit conference rooms.
I have been testing various solutions for semi-professional-looking interviews with remote staff and concluded that a dedicated microphone plus the rear-facing camera of whatever smartphone they have at home is by far the best bang-for-the-buck solution. Webcams are too terrible, and mirrorless/DSLR cameras are too expensive and complicated to set up and use.
I have been successfully using the Anker PowerConf 300 webcam. With the AnkerWork software on my M1 Mac, I find it has excellent autofocus, auto exposure, and zoom and pan capabilities. The cost is $130, but it is often on sale at $100.
I have used it with iGlasses, but that tended to crash after about an hour. So I use AnkerWork, though it is less convenient.
This is a symptom of the race to the bottom in the digital camera industry. Most webcams are built around low cost modules running firmware developed by an OEM who made it good enough to ship and nothing more. They don't have any incentive to improve the features beyond banner specs and their buyers have little influence.
Question for people using Camo: isn't it annoying having your phone locked up for the whole video call? No way to even check your notifications or do anything else on it? And constantly having to plug and unplug it, adding more wear to the delicate USB-C connector?
I would find it really annoying yes.
I used Camo with a spare iPhone that has a broken screen. Despite being quite happy with the image quality, I found the experience frustrating enough that I don't bother using it anymore, even without tying up my main phone.
Every time you want to use it, you need to turn on the phone, open the app, plug it into your laptop, and make sure it's sitting in the stand properly.
It's just a bit too much friction for me to bother with it for day to day zoom calls.
There’s a bunch of advice on improving video calls here: https://www.benkuhn.net/vc/
It touches on webcams. Some other important things for video calls apart from webcam picture quality are:
- latency. This matters everywhere (webcam, webcam driver, codecs, network, graphics pipeline, monitor software, physically switching the pixels) but I would expect webcams to introduce higher latency due to generally not being good
- audio matters a lot (as does audio latency and slowing down audio to synchronise it with slow video)
- some lighting things
- to some extent eye contact will also be wrong with a webcam so maybe the FaceTime style thing where your eyes are ‘corrected’ to appear to be looking at the camera instead of your screen would help
Yes, if you do a lot of business communication, this stuff is important. I talk to way too many people who should know better but clearly don't have a clue about any of this, and who end up looking and sounding pretty poor.
It's pretty simple. Most Bluetooth headsets are not great, including the expensive ones that Apple sells you. And even if they are good, the Bluetooth audio compression is pretty bad, so you are unlikely to sound very clear. Avoid that and use a wired headset or even your laptop's built-in microphone instead. For optimal results, invest in a good microphone. There's a reason professional podcast hosts sound so good: they use proper microphones. You can sound that good as well. A simple wired clip-on microphone doesn't cost a fortune.
Audio filtering is pretty aggressive these days and that can cause additional issues. For example eliminating feedback and echoes is something that a lot of software tries that can result in your audio cutting out and getting distorted. Solution: wear headphones to listen to the sound. This prevents feedback from having to be filtered out. Also, being in a quiet environment helps.
As the article notes, many webcams are not great in terms of hardware. In practical terms this means low resolution sensors that will have a lot of noise. The darker it gets, the worse it gets. So be in a well lit room and pay some attention to the light situation. A simple desk light turned the right way can do wonders at night. Professional youtubers tend to invest in special lights for a reason: it makes things look good.
Obviously, you need a stable network connection. Doing multiple video calls on a domestic wifi that has multiple family members doing whatever in a room that is around 10 meters from the base station is probably going to result in a pretty bad connection. A few meters can make all the difference. And if you are upload constrained (like many DSL connections are), there are going to be issues. If it's really bad, turn off your video. You'll sound better that way and people will be able to understand you at least.
Finally, pay attention to where the camera is pointing. If you are a salesperson who spends a lot of money on their outfit, do you really want the lasting impression you make to be your nostrils filmed from below? Also, clean the room you are in and don't have a backdrop of your domestic crap, laundry, and worse. It just doesn't look professional. It's not hard, but you need to pay some attention. If it's really bad, you can use a fake backdrop or blur it, but be aware that can look a bit glitchy on the other side. Better to just be in a nice-looking place. Also nicer for yourself, BTW.
My theory is that video conference applications should simply inject whatever horrible things you are doing to your own image into the image you see of the other person :)
If your camera shakes while you type, your view of the other person will shake too. If you have a horrible echo, so will the other person. If you are backlit with dim light, get ready for a dim, grainy view of the other person. Low res? Blown highlights? Yep, we can show you that. The goal is to setup the incentives correctly. Most people seem to appreciate a good view of their peer but many can’t seem to be bothered to do the work themselves.
Friendlier alternative: Build classifiers for the 10 most common ‘you look bad’ situations and have the video conference app turn on the ‘dummy lights’ (to borrow from cars) that warn you what’s up and how to fix it.
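As a toy illustration of what one of those 'dummy lights' could check (a sketch only; the thresholds are invented for illustration, and a real classifier would need to be much smarter):

    import cv2

    cap = cv2.VideoCapture(0)
    ok, frame = cap.read()
    cap.release()
    if ok:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        h, w = gray.shape
        # Compare the center of the frame (roughly where a face sits)
        # against the frame as a whole.
        center = gray[h // 4 : 3 * h // 4, w // 4 : 3 * w // 4]
        if gray.mean() < 60:
            print("dummy light: scene is dim - add some light")
        elif center.mean() < 0.7 * gray.mean():
            print("dummy light: you look backlit - light your face, not the window")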
The poor auto exposure and white balance touched on by this article really affected my webcam’s quality. A lamp in the background of my office caused my webcam to choose some setting that caused my face to be very red and blotchy.
I’m not being vain. People noticed it and it strained conversations. It’s harder to effectively communicate when your audience is uncomfortable by your appearance. Something to think about elsewhere in life.
But I digress. The Logitech software is horrible. I found CameraController [0] to adequately solve the problem. It allows me to adjust exposure and white balance. Now conversations feel more natural at work.
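For the curious, the same exposure and white-balance knobs can be poked programmatically where the OS backend exposes them; whether each property actually takes effect depends on the camera and driver. A hedged sketch with OpenCV (the values are illustrative, not recommendations):

    import cv2

    cap = cv2.VideoCapture(0)
    cap.set(cv2.CAP_PROP_AUTO_EXPOSURE, 1)      # 1 = manual on many V4L2 devices
    cap.set(cv2.CAP_PROP_EXPOSURE, -5)          # scale/units vary by backend
    cap.set(cv2.CAP_PROP_AUTO_WB, 0)            # disable auto white balance
    cap.set(cv2.CAP_PROP_WB_TEMPERATURE, 4600)  # roughly neutral indoor light
    ok, frame = cap.read()
    cap.release()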
I'm quite fond of a raspberry pi hd camera and showmewebcam. The problem I've got is that the only lenses I've got to hand mean that I need to mount the camera about a metre behind the monitor, which is really annoying. Definitely need a wider angle lens.
*On Mac. The Kiyo has settings on Windows. I had a lot more luck setting it up on a Windows laptop and then using it on Mac, and separately on Linux. The ring light is dimmable, and I won't post them, but I think my selfies look better than his.
Zero affiliation with any vendor, and it shits me to tears that Razer does Windows-only settings; Razer's whole software experience is something I dislike intensely. Still, it's a reasonable option for your consideration, along with a Blue Yeti mic, for a Linux-based home office setup where you look OK, your voice sounds clear, and the cost isn't as bad as it could be.
I briefly worked with a guy whose previous startup had been premised on the idea of combining several consumer-grade phone-cameras into a small array and using software to build a far better image than any one of the cameras could accomplish individually, yet remain cost effective because these things are sold in such absurd scale.
I don't know what became of it, but it sounded plausible (at least the computer vision parts that I understood were sane-sounding).
Does anyone know if people are doing this sort of thing? (I'm aware that the expensive iPhones have more than one camera but IIUC that's for different reasons).
During COVID I had to shoot some remote video for a short segment. Ended up putting my Sony Alpha 7 III and my (relatively awful) Logitech C920 on an Elgato mount setup with roughly the same viewport. Connected the camera via a Magewell USB 3 capture card to OBS (really great software) and made a virtual output so the producer could remotely set up the camera exactly as they wanted. For shooting I just swapped scenes so they would watch the live Logitech view, and recorded directly onto the device. A wonky setup, but the video quality was great.
Getting good lighting was more of a challenge for the next few vids.
Here's a question: why do people care so much about what they look like or how the camera displays them? Personally, I care more about the content being delivered than what someone (including the scene) looks like.
> I care more about the content being delivered than what someone (including the scene) looks like.
IMO, we all want to believe that but we really don't. All other things being equal, a professional-looking setup will trump someone streaming from their messy bedroom unless the content is really that good, which it typically isn't. Conscious or not, first impressions are a thing.
This is the type of opinion that used to infuriate me as a young programmer, as I used to think that good code presented badly would win over bad code presented nicely. I can now admit that I was wrong, and I therefore always invest some time in making everything look as nice as possible. Making my delivery more professional has not shown any downsides yet.
I can’t speak for everyone obviously, but for me, lower quality setups are distracting and make it hard to focus on the material. Good setups “disappear” and let the content shine.
I think we only notice a quality setup on video calls now because the one person who cares stands out.
Humans care in general, even if they think they don't. We respond better to higher quality rendition of facial expressions.
Google search has degraded so I can't find it, but I remember from a couple of years ago a study showing that people subconsciously associate image quality and lighting with status. People with better lighting look richer and of higher status.
Panasonic released "beta webcam" software for several of their more recent cameras in September 2020 [1]. Luckily I have one of the supported models.
Unfortunately, it has bugs, it doesn't support Apple Silicon, and it hasn't received an update since.
Hello Panasonic... please fix the issues and release an update, including Apple M1/M2 support!
We just use webcams to say "hi!" and all that stuff, then turn them off; someone has a PowerPoint on, and even if someone else left their camera on, no one is looking at them.
So basically, a VGA camera would be enough to recognize my face, smile and wave, and after that, it's audio only.
Laptop mics are a different story, especially with fans on high... and Bluetooth headset mics, or cabled earphone mics... Some seem to work really great, and some people seem to be talking from somewhere deep inside a well, or even worse, and there is no apparent price/audio-quality correlation.
>Laptop mics are a different story, especially with fans on high
Switch off your own video - most of the work is video encoding, and unless you have dedicated hardware, it'll suck - sorry, blow hard; the fan, that is.
I'd love to just use an old smartphone as my webcam, but unfortunately they really aren't made to be connected to a charger 24/7. If they at least let you charge the battery to 80% and then bypass it until the next cycle, or power on with no battery present, this would be a great alternative.
I got a OnePlus 6T here that I'm not using anymore, the camera is great, but I think if I tried using it as a webcam I'd just end up watching the battery balloon through the case (or worse).
The iPhone's camera has had more engineering resources poured into it than the other three webcams reviewed put together; it is disingenuous to expect similar quality.
I recently got the Obsbot Tiny 4K webcam. Even though 4K won't come through on videoconferencing software, the color seems much more natural than on Logitech webcams.
If you want a great webcam, you can buy something like a Canon M50 mirrorless camera, or wire up your iPhone (or Samsung!) to your PC.
Either way you will end up spending close to $1000. Standalone webcams just don't have the CPU or firmware to create great pictures.
The curious thing for me is why the webcam on my M1 Mac is so bad, when the hardware/firmware involved is so closely related to what's in the iPhone.
This makes me wonder whether there's some relatively simple configuration of cameras that can be used to give a basic 3-d type experience. Nothing crazy like the oculus, but maybe something where a computer can use head tracking so that as you move your head, you see a little bit around the other person's face so it feels a little more lifelike.
I assume this would only be relevant for the primary speaker, of course.
A nice device is the NexiGo N970P. It's expensive at about $200, but the good part is that it has very good quality and comes with a remote. All settings are made in an OSD and stored on the webcam. It just works; no software drivers or anything needed.
The only downside is the power draw; it runs quite warm. It seems to draw so much power that the USB plug next to it in a Dell docking station stopped working correctly.
Alternative: an Elgato Cam Link HDMI capture interface with a used Sony A6000 system camera is one way to get better quality. It's about sensor size and optics: a mirrorless system camera has, for example, an APS-C sensor, which captures much more light than a typical webcam sensor.
If you're running Linux, gphoto2 works as a camera-to-webcam capture tool; given the right camera model, it can turn a system camera into a webcam.
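For reference, here's the usual shape of that pipeline on Linux, sketched in Python: gphoto2 streams the camera's live view to stdout, and ffmpeg republishes it through a v4l2loopback device. This assumes the v4l2loopback module is loaded (e.g. via modprobe) and created /dev/video2; adjust to your system.

    import subprocess

    # gphoto2 emits the camera's live-view stream (typically MJPEG) on stdout.
    gphoto = subprocess.Popen(
        ["gphoto2", "--stdout", "--capture-movie"],
        stdout=subprocess.PIPE,
    )

    # ffmpeg decodes it and writes raw frames to the loopback device, which
    # video-call apps then see as a normal webcam.
    subprocess.run(
        ["ffmpeg", "-i", "-", "-vcodec", "rawvideo", "-pix_fmt", "yuv420p",
         "-f", "v4l2", "/dev/video2"],
        stdin=gphoto.stdout,
    )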
From a quick look, that is about €500 (€100 for the capture device and about €400 for the cam on eBay). Also, you probably still want a stand, battery, lighting, microphone...
That is at least 5 times as expensive as a webcam. Is it at least 5 times as good?
If you already have a system camera, that might be a good alternative, but otherwise?
I did quite a lot of research to find a camera with decent picture quality in low-light conditions; I do multiple work calls almost every evening. I ended up also buying a key light and a microphone. I think in this age of remote work, investing in this equipment is the right thing to do for your colleagues. It's kind of annoying when 20 people on the call cannot hear you or see you properly.
Well the first rule in photography is set up your lighting properly. Low light is hard for professionals. Don't expect good results in shitty lighting conditions. It's kind of important.
What is going on in the video of the C920 exposure test? It’s not clear if he’s changing lighting, changing camera settings, or letting the driver change some internal settings, but the view out the window is getting less and less overexposed while his face remains exactly as overexposed at each step. It’s like it’s exposing different parts of the frame differently and all poorly. WTF?
OK, fair enough. Point being that the exposure of his face never changes, but the exposure of other things in the frame does. That’s pretty bizarre, especially given that his face is overexposed.
I am using my phone with IP Webcam so I can compress the stream directly on the phone and use the compressed stream on my computer as native webcam input via v4l2 (through the v4l2loopback kernel module).
This way I get high quality and a high frame rate in a manageable format, and my computer can focus on streaming while my phone does what it was (also) built for: providing a quality video feed.
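The same loopback trick works for a network stream. A sketch of how that wiring can look (the phone's IP address is a placeholder; the :8080/video path is IP Webcam's default, but check yours, and /dev/video2 is again an assumed v4l2loopback node):

    import subprocess

    # Pull the phone's MJPEG stream over WiFi and republish it as a local
    # webcam through a v4l2loopback device.
    subprocess.run(
        ["ffmpeg", "-i", "http://192.168.1.50:8080/video",
         "-vcodec", "rawvideo", "-pix_fmt", "yuv420p",
         "-f", "v4l2", "/dev/video2"],
    )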
I use Camo (lifetime $79 license) and an iPhone SE I picked up on eBay for $200.
I've had dozens of people comment on how clear my camera is. Many have since purchased a similar setup. I used OBS for a long time since it was free, but the quality wasn't close.
If you're in a line of work that requires you to stream it's a worthwhile investment by a great company that just keeps getting better.
I used to use this setup but got tired of turning on the phone for meetings and doing that while it's in its stand or messing up the camera angle a bit. Camo also sometimes wouldn't connect right away which is especially annoying when I'm already running late. Did you find a way to mitigate those issues?
On timing – I just had to start planning a bit better. In worst case I'd start the meeting without video and turn on a minute or two in. Well worth it for me.
I bought an iPod Touch (~$100) to use with Camo Studio. It works great. Lets me keep my laptop lid closed all the time which I prefer, and the video quality is much much better than the built-in camera. The software costs a little but it's well-made and I don't mind paying for stuff that does what it's supposed to do.
That's fair; the current iMac is only 1.15cm thick, but that's much thicker than an iPhone.
They did put a much better camera in the Studio Display, but it's significantly thicker than the iMac. I don't know how thick it is without the stand/mounting hardware, but the VESA-mount version is 3.1cm thick, so the display itself must be at least 2cm.
I would pay good money for a solid dedicated camera for video calls. I know I can just use my phone, but I'm one of those weird people who only has one iPhone and it's really annoying to have to undo my camera rig every time I need a 2FA code. I would happily spend hundreds of dollars if the product was good.
I didn't see any mention of an important one for me: size. Many of the higher-quality webcams are massive. I actually had to buy a shitty webcam to find one at a size I was happy with. My desk aesthetic is really important to me, so that was a big reason I returned all my webcams.
I used to be able to set the exposure manually on my webcams in Windows (more granular than the 7 levels I'm offered now). That option just disappeared a few updates ago on Windows 10, so now I can't even fix my own lighting. I think it was MJPG support that was removed?
What do other folks think about the impending ability to use an iPhone as the camera for your Mac, with bokeh/blurring built in?
One thing I wonder about is whether the largest phones will be too heavy, especially for laptops that were built before this capability was anticipated.
Even worse (for me), is the fact that there aren't any decent and affordable microscope/desktop cameras for home/semiprofessional use. Unless you shell out a few thousand dollars, you get the same crappy camera modules that webcams use.
On the consumer side: not until people can actually afford casually dedicating the bandwidth necessary to deal with multiple 4k streams. Because either you need a truly monumentally liberal internet plan for sending uncompressed 4k at a normal frame rate, or you need a hardware encoder (baked in to your motherboard or discrete graphics card, or as separate purchase) to make sure you can send compressed 4k. And that only covers the sending part, you need to also receive and have the hardware capable of smoothly rendering the 4k video stream you're receiving.
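To put rough numbers on the uncompressed case (back-of-envelope only, assuming 8-bit 4:2:0 chroma subsampling, i.e. 1.5 bytes per pixel, at 30fps):

    width, height, bytes_per_px, fps = 3840, 2160, 1.5, 30
    bits_per_second = width * height * bytes_per_px * 8 * fps
    print(f"{bits_per_second / 1e9:.1f} Gbps")  # ~3.0 Gbps, one way, per stream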
But more importantly, on the industry side: given that live TV broadcasts are still 1080p, the answer here is almost certainly "not until the broadcasting world decides live 4k by default is even remotely worth it."
What's the financial incentive? Everyone who needs 4k streams is making money off them, and spends money on fancier camera gear. Most people just don't need it, so there's no money to be had.
It's hard to find a 60fps 1080p webcam that has a deep in-focus region (to minimize focus hunting) and that doesn't have bad image quality, and that's way more useful.
You can download the app "Iriun Webcam", run it on your iPhone or Android phone on the same WiFi network, and it will feed your phone's camera in as webcam input. No tech setup needed; just download the app and you're done.
All of the primary reasons I purchased it hold up great. Only a couple of snags: the cord is way too short if you don't have a compatible port on the back of your desktop monitor, and the camera gets warm and stays warm. Too warm.
The software runs constantly in the background. The camera runs warm when it's in use (I expected this), but it also stays warm and "doing something" despite not being active when I put my Mac to sleep (screens off). Unless I unplug it, the C1 stays warm as if it's engaged and running. I reached out to their team, and they said "just unplug it when you're not using it." Being forced to unplug it kind of defeats the purpose of having a high quality desktop camera set up.
Opal's team promised a fix for this in a software update, but I've had it for three months and have received zero software updates (though I've noticed they have done alpha test releases). None of these have reached GA, to the point that the subreddit routinely asks whether the company is still going.
Overall, I really like the camera. However, Opal as a company certainly has reason to get more active and deliver quality if they expect us to start paying a subscription for the software.
Even if I hook up a great camera, Chrome/Firefox's WebRTC will squeeze my bitrate and lower the resolution to 720p. There is no way from the client side to upgrade the baseline quality for the Meet/Teams/Zoom web apps.
The webcam in the MacBook Pro seems to be way better than most. Complementing that is an extremely good mic. Arguably these are two of the truly "professional" touches in their line as of late.
It honestly doesn't matter if you primarily use your webcam for Zoom, Teams, etc. You could have an 8K cinema-grade camera and the image other people see is still compressed-to-death dogshit.
The whole webcam market is in a sad state, with little to no R&D and ridiculous prices. How else do you explain that the 8-year-old C920 is still considered a quality leader?
I wish I could use Camo. But my company doesn't allow third party software without admin approval. Makes the process of installing Camo substantially more obnoxious.
Well, there's Polycom, and their pro/business solutions were delivering exceptional results back before Zoom. I don't know about their current product line, though.
Considering how so many people on remote video calls look like they just got out of bed (wearing a hoodie, messy hair and beard) - I want to see less of them, not more.
I bought a Samsung A51 phone on clearance at Target and installed Iriun on the Mac (for Linux I use DroidCam). Much higher quality than the Brio it replaced.
I am not sure. I looked for an answer but didn't see anything. It appears to behave like a perpetual subscription (meaning free upgrades) but I think you would have to contact the customer support team at the company to get the answers.
If you feed 4K at 60fps to your computer, that's one more thing it has to digest on top of your running applications, only for the 4K picture to be downsized to 720p in presentation anyway.
Besides, have you ever wanted to see every pimple on the face of your co-workers?
The image needs to be good enough to capture body language and minor frowny attitudes, but a stunning picture won't fix the fact that your product is late, your co-workers are slow, and it's Monday.
Webcams feel like a dead-end technology, like self-rewinding cassettes on the eve of the compact disc.
The fidelity is garbage relative to something like a DSLR inside an Errol Box [1] and miles away from Starline-style holography [2]. Eye contact is poor; putting in a software filter to fix eye contact should be universal practice at this point. Hell, even having a basic ring light next to your lens is universal among Gen-Z streamers and completely absent from Gen-X remote workers.
With the chip shortage mostly over, I'm amused how many people working at MANGA, nominally our most proficient tech workers bringing in half-million-dollar salaries, are just phoning it in with garbage setups that could be trivially fixed by watching a 15-minute YouTube video on how to position your desk to capture natural light, using the wired microphone you already have in a drawer somewhere, and checking three boxes under settings.
I'm not proposing to get a studio, just maybe give your fellow humans the decency of picking up your emotional nuance if you're transitioning to not seeing them in person.
You know what's vastly worse than a co-worker whose camera has a sub-optimal image quality? A co-worker who's constantly struggling to get their overkill setup to behave, wasting everyone's time while they fix the mix on their audio or switch cameras because their mirrorless camera turned itself off from thermal overload yet again.
Literally just use a wired microphone, for instance, and not only will one have better fidelity, one also immediately gets past the "Can you guys hear me?" routine one finds with AirPods or fancy wireless gaming headsets.
Not entirely sure if this was meant as satire, but in case it wasn't: If I wanted to be judged on my physical looks or the quality in which those are presented to others over the wire, I would have been working in the film industry.
I genuinely believe a loss in fidelity moves us from acceptable telepresence into the uncanny valley.
Landlines had exceptional fidelity and let one pick up the nuance of a person's emotional state. Today's cellular connections, not so much. Thankfully things like Apple FaceTime (audio-only) recapture some of that.
Likewise with a high-quality webcam: I think it's important to treat your coworkers with respect by giving them the grace of seeing your body language in high resolution if you're going to opt out of physical meetings. Remote work is fine; just don't phone it in with the equivalent of a 2000s-era potato camera.
I agree in that improved audiovisual quality would be beneficial, but, at least for regular video calls, I don't believe it's reasonable to expect for people to go out of their way to do better than the quality their (not too old) laptop provides - not past ensuring they're in a reasonably quiet environment with halfway decent lighting anyway.
Besides, recent generations of laptops are finally shifting to better quality webcams and microphones. We'll get to where we'd both like it to be, eventually :)
I intentionally scale down my video as I don't want to be seen. Being on video requires a lot of cognitive effort, which I'd rather conserve to do actual work.
I grew up in similar times. What I'm noticing is that only one model of cellphone (not an ad, so I won't name it) gives me decent sound from the person on the other end. But I never had that bad experience using very old phones.
Interesting. I'm curious: what phone/carrier was that? If you were able to get decent sound on one phone, then it sounds like there's nothing intrinsically or insurmountably wrong with cell phones.