I'm the cofounder at Lumina - we're building a modern webcam designed to solve some of these problems.
There's really been a lack of innovation in the entire home office space, with the webcam being particularly bad. It sucks that a decade-old product (the Logitech C920) is still the bestselling product today -- that would be like if Apple had stopped releasing new phones after the iPhone 4S (launched 2011) and it remained the bestselling phone through now.
A few thoughts to add to the article:
- On why webcams aren't seeing innovation, I'd disagree that the market is too small. There's enough gross margin to produce a $B company just by selling webcams [0], especially if you can actually get customers excited about the product.
- A big reason there hasn't been innovation is that the space doesn't attract entrepreneurs (because hardware is viewed as hard) or investors (because hardware is viewed as hard).
- Size isn't everything. As the iPhone shows, you can get very good image quality from a tiny sensor and lens if you have the right tech supporting it. (At Lumina, most of our eng effort is on the software layer)
I would've loved to see Lumina in his comparison. We launched a few months ago and are seeing many reviewers prefer us over the Brio (Logitech's flagship) [1]. Personally, I'd guess we're at 60% of where we can be in terms of quality, and I think we can achieve a quality level between an iPhone and a DSLR, hopefully closer to the latter.
> Size isn't everything. As the iPhone shows, you can get very good image quality from a tiny sensor and lens if you have the right tech supporting it. (At Lumina, most of our eng effort is on the software layer)
This is my problem with all the webcam startups. So what if you can mask some of the problems of small sensors and lenses using machine learning that adds a whole new set of problems? You could have done that without even making hardware at all. We have plenty of crappy hardware out there already, and if yours is only a minor improvement with the "magic" in software then it mostly amounts to a hardware dongle to enforce your software license. No thanks!
If you're going to bother making hardware, you should make good hardware. That means a big sensor and a big lens. Start there, and sure, go crazy with the machine learning afterward -- you'll get much better results with less effort when you start with better input! And you'll have no competition, because there's literally nobody else out there putting decent lenses on webcams.
I agree. It's a little strange to complain about a lack of progress in webcams over the past decade, and then ship hardware of similar quality to decade-old webcams.
Why does everyone focus on a small form factor for webcams? Look at the size of microphones being sold today. The Blue Yeti is one of the most popular microphones and it's almost 30cm (12") in height. You might say, if you don't mind a large form factor, then go buy a DSLR or mirrorless camera. The problem there is they're not designed to be webcams. I watched one of the Lumina videos, and they were complaining the $2,000 Sony camera took forever to set up and configure for their demo. That's a problem. Someone solve it. Give me a large, high-quality camera that is designed to be a webcam first. I have lots of room on my desk; it doesn't need to fit in my pocket.
> I watched one of the Lumina videos, and they were complaining the $2,000 Sony camera took forever to setup and configure for their demo
Is that even a real problem? I started using my Fuji mirrorless camera as a webcam during the pandemic and it was almost trivial to set up. All I needed to do was download some official software, plug the camera in, and switch it on for it to start working as a webcam.
Same here: I used my Sony A7III over USB with their software as an excellent webcam, and since I bought the new Sony A7IV, my clients always ask in video calls what webcam that is. At my new employer I had several coworkers asking me how the heck I did the insane background blur. (They didn't know it was a mirrorless.)
If you have an old DSLR or a modern camera, you can use a $20 Chinese HDMI capture stick and get the best webcam on the planet.
Old DSLR - I think there's a latency issue you'll find on many older DSLRs - converting out to HDMI in real time - that makes them impractical for use as webcams. Make sure you record yourself in a test conference first and verify.
I dunno what you count as old, but my 8-year-old Sony A6000 has basically no latency at all. You just have to look up some reviews beforehand and make sure the camera you're looking at supports a clean HDMI output; combined with a decent capture card, you can get rather great image quality for a reasonable price.
My Sony A6000 with a 35mm f3.5 lens was 350 euros second hand. The Elgato capture card was 99 euros, a mini-HDMI to HDMI cable 8 euros, and a dummy battery wall charger 25 euros. All in all, I get a crisp 60 fps 1080p image with great bokeh for under 500 euros.
There's always the Opal C1, which comes in at around 300 dollars, but imo the image is pretty terrible in comparison.
DSLRs can be pretty old - I have a Canon EOS on the shelf that outputs a clean 1080p picture, looks great, and is over 15 years old. But the latency in the video signal sucks. Probably the same Elgato capture card you have.
You'd think that any DSLR from, let's say, the last ten years would be just fine, but it all comes down to the internals of the cam in question. I'm just saying it's wise to check the experience on the other end before putting an old DSLR into production. Out-of-sync audio is more distracting than lower-quality video.
Speaking of cheap cameras - don't count out older camcorders either. An older (but quality) camcorder will get you drastically better image quality and control capabilities than a webcam too; same process as hooking up an older DSLR (capture card, etc.). Same latency challenges to be aware of, and maybe more so. I've seen a camcorder produced in the last two years that had too much latency in the HDMI signal to be worth the trouble - it was lower end, but it illustrates the point that component quality varies and has an impact on your use case.
Except it cannot handle exposure changes at all, whether from a cloud or just moving around in the frame. So you AE lock and babysit. It’s just easier to link up your phone.
Smartphone cameras are small but have far better sensors than what you get in webcams. They rely on software to produce images of the quality we're used to seeing, especially with less-than-ideal lighting, but the lenses and sensor are still far better than the Logitech C920's.
A webcam doesn't need the latest and greatest video hardware. Give it a camera module from a midrange 2017 smartphone and use the latest image processing tricks, and you'll blow away almost any webcam on the market today.
> Give it a camera module from a midrange 2017 smartphone and use the latest image processing tricks, and you'll blow away almost any webcam on the market today.
Sorry, that's exactly the strategy of these webcam startups, and Apple with the Studio Display, and the results empirically suck. Even if you could get 2017 smartphone quality (which is still apparently an unsolved problem), why should you settle for that in a webcam that doesn't need to fit in your pocket or run on a tiny battery or cost a tiny fraction of the BOM of a much more complex device? We should be doing way better than that. And way better is definitely possible.
As stated somewhere else, in the end the video will go through very lossy compression and will often be scaled down when used at work. I mean, I'm now on my 4th webcam since the pandemic started, because the other cams were too dark, constantly zooming in and out for no reason (MS LifeCam Cinema), or had a poor image. I've now stuck with the Logitech C920 because I can manually set brightness and contrast and toggle autofocus.
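Incidentally, those manual settings can also be scripted rather than clicked through. A rough sketch that shells out to `v4l2-ctl` (from v4l-utils, Linux only) -- the device path and control names are assumptions here; they vary by camera and kernel version, so list yours with `v4l2-ctl --list-ctrls` first:

```python
# Hypothetical sketch: pin a UVC webcam's exposure and brightness so the
# image stops hunting. Control names and /dev/video0 are assumptions.
import shutil
import subprocess

def set_controls(device="/dev/video0", controls=None):
    """Apply manual V4L2 controls via v4l2-ctl; return a status string."""
    controls = controls or {
        "auto_exposure": 1,             # 1 = manual mode on many UVC cams
        "exposure_time_absolute": 250,  # fixed exposure
        "brightness": 140,
    }
    if shutil.which("v4l2-ctl") is None:
        return "v4l2-ctl not installed"
    for name, value in controls.items():
        # check=False: an unsupported control shouldn't abort the rest
        subprocess.run(
            ["v4l2-ctl", "-d", device, f"--set-ctrl={name}={value}"],
            check=False,
        )
    return f"controls applied to {device}"

status = set_controls()
```

The settings persist until the camera is unplugged, so a script like this in a login hook effectively gives any UVC webcam "remembered" manual settings.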
I think that understates how bad the current state of webcams really is. Especially when you move around a lot while talking or there's an unusual lighting situation. IMHO these core problems need to be solved first (and then of course it'd be great to build upon that).
Bigger sensors and bigger lenses are exactly what would solve those core problems. Even after video call compression (which is not always extreme) it is trivial to tell whether a video was taken with a big sensor/lens or a small one. Compression is no reason to use a crappy camera. Garbage in, garbage out.
> So far no one has, there must be a reason no one can.
I am reminded of the story of two economists walking down the street. One spots a $20 bill on the sidewalk and points it out to the other, who says "Don't be ridiculous, if there really were a $20 bill on the ground someone would have picked it up by now."
Welcome to what camera makers realized a long time ago - film doesn't matter, the processing does.
It's why the DSLR was (is?) a thing - if you ever do research on one, you realize they factor in quantum mechanical effects to make your picture better... (in addition to the 20 or so simultaneous images they take in order to parallel-process the image).
I haven't kept up as much, but - the flat-sensor craze was all about how we didn't need DSLRs anymore and could just get away with large pixel counts and better processing, and yet the processing still matters... far more than the pixel count. I think that was mainly because 100+ mirror systems are a bit expensive compared to a single flat sensor.
But when you think about them as 100+ parallel processing sensors performing low-power auto-computation _for free_, well.
That's another thing entirely.
I do wonder why some DSLRs suck at video - I suspect it has to do with whether the manufacturer cheaped out on the compression circuitry or not - some cameras could shoot HD video, but a lot did not, leaving that to the pricier models.
At that price point you're competing with a GoPro, which is likely as good or better at a lower price. All (or most?) current GoPros can be used as a webcam.
Actually, I tried to do exactly this, except the GoPro webcam software straight up doesn't work and they don't plan on fixing it.
It's depressing, really.
That would indeed put a dent in that plan :\ I suppose there are always DSLRs, as those are also sometimes in a similar price bracket (many recent ones can act as a webcam as well).
I gather support / fixes / etc have been rolling out rather slowly, have you tried it in the past few months? Or is it currently that unusable? I certainly wouldn't expect the software to be difficult to build, there are quite a few webcam-emulating projects out there... but then again this is a company's software rather than an interested hobbyist.
GoPro is too clunky to carry around for use with a laptop.
Meanwhile, sensor and lens from a smartphone camera can fit in a small enough form factor that it makes sense to carry it around alongside your laptop.
Almost all laptops have a camera already, crappy as they may be, they still work. I don’t think laptop users are a big market for webcams? Isn’t it more for desktops where you have the luxury of space and bulk?
Growing trend in a niche market I would suggest. I know people who have 2 laptops - one for gaming, one for work (with webcam etc). Because the gaming laptop is too bulky to carry around, it's effectively a desktop.
If you are a gamer, you don't use a laptop. Gaming laptops are effectively great portable desktops. They have little battery time when gaming or doing heavy compilation, so they have to be plugged in - and they're still not great as a desktop gaming station.
I personally use a 'gaming' laptop as my default work computer (it's over 3.7kg without the charger, which is another 600g).
A low end mirrorless camera is a fantastic webcam -- the only problem is it's clunky to set up that way especially on the software end. If you made one that's plug and play I would buy it.
Depends on who you are. "Nobody" would spend $1k+ on a desk mic, unless you are upping, e.g., your podcasting game.
I suspect a lot of vloggers, or at least wannabes, can and would easily shell out a measly $500 for a webcam.
I agree, for a lot of people who just want to webcam mom that's going to be overkill. But if you have money, or do it professionally (I don't mean a remote-work job) to the point where you are putting on makeup and investing time, often to the tune of thousands or tens of thousands of dollars, it's a no-brainer.
The Studio Display does have a 27" 5K P3-color monitor attached to it, which drives up the price a bit...
Seriously, though, in the context of webcams, the Studio Display is kind of a counterpoint to "just rip a camera module out of a few-year-old smartphone and it'll be terrific". That's pretty much what Apple did here, and it's rather notoriously, ah, not great. (I have one, and I think it's better than some people give it credit for, but I know I'm comparing it to Apple's laptop webcams, which have historically been pretty terrible.)
Sadly, Apple's laptop webcams are still way better than the ones in competitors'. I used an HP Spectre from 2020 or so, and it had literally the worst digital camera I have ever seen, including the 90s. It's pinhole-sized, and it made a brightly lit room look like a closet at midnight while being grainier than off-brand film. They might as well not have bothered.
There's no such limitation with a dedicated webcam, the kind you mount on top of your monitor. Yet these also are noticeably worse than what you'd find in a 5-year-old phone.
Resolution is just one factor. The processing you do to an image can easily make a lower-resolution image look better than a high-resolution image with horrible colors, bad contrast, poor tone curves, etc. I'd describe the output on Macs as "inoffensive". It isn't great, but it also isn't bad.
Most people who dabble in photography and shoot raw will have some idea of this. The (untreated) images that come out of my professional DSLR don't look as pleasant as those that come out of my iPhone. Even if they have 4 times the pixels, and each pixel has much more dynamic range. How the raw sensor input is interpreted, influenced by intent, and then rendered, can be the difference between something you would hang on your wall and something you'd delete.
I have half a dozen Macs with cameras, including the 16" notched M1 MacBook Pro. None of them produce a great image, but they won't stand out as bad in a video conference. My $1000 camera does stand out because of its poor image processing, especially in low-light conditions. It is a bit sharper, but the colors look distinctly off.
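That tone-curve point shows up even in a toy example (pure illustration, not any vendor's pipeline): the same raw sensor values rendered through two different curves come out very differently.

```python
# Map linear sensor values in [0, 1] to 8-bit output through a gamma curve.
def render(raw, gamma):
    return [round(255 * (v ** (1.0 / gamma))) for v in raw]

shadows = [0.01, 0.02, 0.05]   # deep-shadow readings from the sensor
linear = render(shadows, 1.0)  # naive rendering: crushed to near-black
graded = render(shadows, 2.2)  # sRGB-style curve: clearly separated tones
```

Same pixels, same "resolution" -- the rendering alone decides whether the shadows read as detail or as mud, which is exactly the gap phone pipelines exploit.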
For what it’s worth, I agreed with the cynical view of GP until I read this comment. A very convincing defense of the prioritization of the software layer.
Completely agree, this is the old adage: "Garbage in, garbage out". Improving the data the sensor collects is surely much easier than using ML models to correct after the fact. The people willing to pay a premium for nice big lenses/sensors are the market to go after, everybody else won't care since they already feel their current set up is good enough.
I expect mobile phone cameras to kill webcams. All the innovation happens there, and when people already have something getting closer to DSLR quality each year, why invest in a specialized webcam?
Apple just needs to bake webcam sharing/remote camera control into iOS and come up with a nice attachment for the phone that works with MacBooks and displays so you don’t have to look at the camera at a weird angle.
Maybe some magsafe sticker behind the display so you can snap your phone to the back of the display easily, with the main rear cameras pointing at you above the display
> Apple just needs to bake webcam sharing/remote camera control into iOS and come up with a nice attachment for the phone that works with MacBooks and displays so you don’t have to look at the camera at a weird angle.
> Maybe some magsafe sticker behind the display so you can snap your phone to the back of the display easily, with the main rear cameras pointing at you above the display
You do realise they just did exactly this, right? :)
But look at it and see why it doesn't really work... It's an awful contraption, and your phone is tied up for the duration. And I doubt it will work well with every case, either.
Apple already did this as discussed, but I’m going to make a different point:
This whole idea is against what Apple wants to be and so I think they will kill it (“evolve past it”) as soon as they can. Using an iPhone as a webcam is below Apple’s threshold of sleekness.
Apple would be the kind of person who doesn’t have anything in their pockets because that would “ruin the lines”, and when they need to buy something at a store instead of carrying a credit card they just call their lawyers and tell them to buy the whole thing so they are free to take whatever they want and write it off inventory.
When they interact with someone who is not as sleek, they look down and imply they are poor. “Why would you carry around that big thing (card) when you could simply buy the store? I don’t know if we should associate with each other...”
I frequently use my phone while in a meeting. I can't see how using your phone as a webcam would be productive. Using a phone, like an old iPhone model you have upgraded from, would be neat, though. But there are a bunch of apps that let you do this today, I believe.
This appears to support just photos and document scanning, and it also seems to use Bluetooth and WiFi.
I have a friend who lives in a high rise with incredibly crowded wifi spectrum. Anything wireless that involves latency sucks in their apartment. They tried to use an old iPhone for VC, but the jitter added by their crappy wifi made that a non-starter. By contrast, this product uses a USB connection, and I'm guessing would perform far better for them (and really anybody with crappy wifi, which is most people).
I agree that it would be awesome to use an iPhone in this context. But given the problems my friend has, and given the problems in general with mac wifi and latency, any use as a webcam would be far better if it were wired. Hopefully that will leave a niche for this company. That, and using Android phones, and hooking any phone to a Windows box.
Agree, the flat and tiny webcams aren’t really necessary at home or in the home office and I don’t know anyone who carries even the smaller webcams around.
As long as it doesn’t look too bad, a larger lens and sensor would be perfectly fine (think of the original iSight camera, something like that would totally be acceptable) and I’d rather not have my computer be slowed down even further by the webcam doing some magic in the background.
Agreed. Optics aren’t that hard. If you put a good sensor in something with a form factor similar to the old iSight camera, seems to me like it would be a winner.
People seem to have embraced those stupid circle lights everywhere, why is a decent camera a bridge too far?
I bought a Lumina in December. It still doesn't do automatic tuning of the video; everything requires manual adjustment. I can't just have it... work. It even ships with a color card, which I assume is meant to tune things automatically, but it fails miserably. I have tried tuning it a dozen times, but even when I get it right, it only lasts until the light changes (I am by windows). Finally, after about 5-6 months of giving it leeway, I gave up, uninstalled everything, put it in a box in my attic, and went back to a 5-year-old webcam that doesn't look amazing but doesn't look bad.
I have noticed you keep shipping a bunch of random features. I could be wrong, but my recommendation is to try to get it to automatically work right for 99% of cases without manual user input.
Agreed. I backed it and was really hoping to like it. I've since switched back to my Brio. I felt like it took work just to get the quality I was used to with my Brio. It felt like it added more steps and the end result was only on par.
Does Lumina work on Linux? I can't really find anything about it (there is also a Lumina Desktop Environment for Linux it seems), and while it says "Lumina is compatible with Windows and Macintosh machine" this can mean "we don't directly support Linux, but it should work" (like Logitech), or it can mean "we have specialized drivers/software that only work on Windows and macOS".
If your software works only on Windows, you've written non-portable software. Likely, you have all your core logic mixed up with Windowsy UI and now you're stuck there.
If it works on Windows and Mac, you've written portable software but just can't be bothered to support Linux. At a guess the problem is that you want to ship binaries only and don't want to deal with doing all the packaging and testing for Linux.
For a consumer hardware company, one wonders what the benefit of binaries-only is, unless you're hoping to charge a subscription for the software to run hardware you already bought. Or the software is such a mess internally that you'd feel ashamed if it were public (which I can understand, having seen commercial code). Or you have some security feature that you don't want to reveal (e.g. disallowing use of your software with third-party hardware). However, if someone wants to reverse your software methods, they'll find a way.
Perhaps naively, my thought would be that if you want to sell hardware and not software, an excellent consumer-grade and consumer-priced camera platform with an open hardware interface and a decent reference implementation would actually find a lot of uses on embedded (i.e. probably Linux) platforms.
When you write drivers there's no such thing as portable code. You need to target different APIs, different environment assumptions, etc. Sure, there's some common core most likely, but I think this rant is misplaced when talking about a hardware company.
The software postprocessing is a different question and could be more common... But you're still specialising it for DX / v4l2+gui / whatever macos uses. That said... I'd kill for an independent library for video processing that specialises in camera outputs - beyond different profiles, there's no technical reason that couldn't be used independently of hardware.
Maybe in 2002, but it's a USB webcam in 2022, if you're writing your own drivers from scratch on any platform rather than using the standards-compliant ones, I'd first wonder if you haven't got a case of NIH syndrome rather than an actual case of being constrained by the standards.
But maybe they genuinely cannot use standard protocols for some reason, then sure, writing a Linux driver might not be worth it.
I'm half assuming they do something special, half responding to the comment which describes them as a hardware company. But yes, ideally a webcam should have the basic functionality available just by plugging it in, no special software needed.
The postprocessing should be done in the cam though, IMO... and it should ideally give the result back in a standard format.
But I would prefer better optics and sensors over AI stuff like many others have mentioned. When you have good video you have much less need to fiddle with it.
I worked for a company that did pre-InfiniBand HPC interconnect drivers (Myricom) in the 2000s. We supported N different *nixes (Linux, Solaris, FreeBSD, Tru64, AIX, and probably more I'm forgetting), Mac OS X, and Windows in the same codebase. At least 50% of the driver and 95% of the userspace code was OS-independent. In order to do this, our drivers were not "typical" drivers. E.g., we used ioctls everywhere (even on OS X) for driver/library communication.
I don't really get this. Linux has had the same problem with audio interfaces, and you know what the solution was? Class-compliant USB peripherals. That's probably what people are asking for here: nobody wants to install additional software to make their webcam work, so why can't the processing be done on-device (especially if they're charging $200!).
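On Linux you can check class-compliance directly: a class-compliant webcam binds to the stock in-kernel `uvcvideo` driver with no vendor software at all. A rough, Linux-only sketch (the sysfs layout is an assumption; on other platforms this simply finds nothing):

```python
# List USB interfaces currently bound to the in-kernel uvcvideo driver.
# Bound interfaces appear under sysfs as directories like "1-2:1.0".
import glob
import os

def uvc_bound_interfaces():
    """Return USB interface names handled by uvcvideo (empty if none)."""
    return [os.path.basename(p)
            for p in glob.glob("/sys/bus/usb/drivers/uvcvideo/*:*")]

interfaces = uvc_bound_interfaces()  # [] without a UVC camera (or non-Linux)
```

If a webcam shows up here, the basics (video capture, resolution/format selection, standard controls) work with zero installs, which is the bar people are asking these startups to clear.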
If they are selling undifferentiated hardware supported by smart/magic software, they need to block people from running their fancy software on somebody else's cheap hardware.
If they want to get paid for the magic, it will be more expensive to buy hardware from them than it is to buy equivalent hardware elsewhere. How do they get you to buy their product?
My thinking was that it might use some platform-specific APIs for that AI stuff it mentions, or maybe it requires some non-standard extensions because of it. Whether that's a good design or not is kinda beside the point here.
I might be interested in this camera, and asking support will likely only generate "we only support Windows and macOS" without any nuance, so I figured I'd ask here. Better ask before spending $200.
It may be confusing because it's in reply to "Does Lumina work on Linux?" but doesn't answer that question. The tl;dr of the answer would be, "there's reasons it might not work, but I don't know"
From the Obsbot site: "The OBSBOT Meet series hardware is able to work with both macOS and Linux, but we will not make the exclusive software for the Linux operating system."
Webcams on Linux with good driver support are not just for use by humans but also for a myriad of industrial/embedded applications, single-image acquisition, etc.
Don't just think about people using Linux desktop PCs in home offices, but about all the other uses of camera systems.
I can understand the reason for believing there's no overlap, as Linux users tend to shy away from tracking-like features in general (which is the job of a camera!). But a few privacy features like a physical lens cap, and a hardware light that turns on if the camera is powered (not controlled by software) would really incentivize that market, I think.
Linux users still like quality hardware, and are willing to pay for it.
> I can understand the reason for believing there's no overlap, as Linux users tend to shy away from tracking-like features in general (which is the job of a camera!).
I don't agree. This is not the same 'tracking' as the one that Linux users generally abhor.
I'm a privacy nut, and by the way, I track a whole ton of stuff around my house and for my health, etc. The difference is just that I track this stuff on my systems, not on Samsung's or Xiaomi's or whatever. I actually use their brands, but with privacy-conscious software. For example, the amazing FOSS GadgetBridge app works great with Xiaomi's smartwatches and fitness bands, giving them zero internet access and thus no way to track me, yet it still collects the data for me. I know I should support a FOSS hardware product too, but the only one around, the PineTime, is too immature to be useful IMO (I do own one). For my house stuff, Home Assistant is amazing, and it supports many health devices like scales as well.
So, tracking is not bad. Companies tracking me is bad.
> But a few privacy features like a physical lens cap, and a hardware light that turns on if the camera is powered (not controlled by software) would really incentivize that market, I think.
Yes a lens cap is a must. I bought an aftermarket one for my Logitech C920. It seems to be something that's more on the radar now as many laptops have them again (like Lenovo).
> Linux users still like quality hardware, and are willing to pay for it.
In an attempt to disabuse this notion that "nobody uses linux"...
Spoken like someone who only eats from fast food restaurants.
Fast food, for the most part, is horrid, disgusting, greasy, and bad for your general health. Sometimes I'm out and about, and I eat from a fast food restaurant.
I also cook. Lest you think I'm in some minority, the cooking business is a billion dollar industry, with multiple "trendy" startups (hellofresh, blue apron, etc.) selling pricey services. I've even used some of them.
Yes, some people only eat fast food. Some people know no better. Yes, eating non-fast food can be inconvenient, but it's incredibly good for you... and just because we do good things doesn't mean we are all starving artists or paupers.
But no, we are probably not sitting 400lbs overweight surrounded by burger king wrappers filled with xenophobia at anything that's not "like us".
I purchased and returned a Lumina in January. I was hoping it was "good enough" to replace my current setup (Fujifilm X100f with a battery blank and Elgato camlink) and free up my family camera to do family camera stuff again.
It was better than the built-in camera on my gaming laptop, but nowhere close to what I was expecting (even more so because I was on Windows). More "okay-to-good webcam" than just "good" quality without pairing that word with "webcam." I'm on the waitlist for the Opal C1 and will give that a shot, but honestly I might spring for the next-gen Fuji and turn mine into a permanent webcam (too bad my first-generation X100 isn't supported by their webcam software update).
1. If you're saying that it's mostly software, then you shouldn't have created hardware. You should release and sell software to run in tandem with existing cheap webcams. Not a low-quality webcam with some magic software.
2. Saying the iPhone is proof that you can make good, small cameras is bunk. The iPhone has an enormous amount of horsepower at its disposal. And an enormous team of software engineers (in addition to hardware engineers). A small startup with a couple of engineers isn't going to replicate Apple's camera team. Plus, are you aware just how incredibly powerful Apple's SoC is? To replicate it, in addition to the world-class engineering team, you'd either need to offload all that to the host CPU (which won't have dedicated hardware for it), or build it in, which you can't for the price-point you'd need.
I recently bought the elgato FaceCam, and honestly it's pretty good for $150. Much better than Logitech. It has a pretty large sensor and good software (and no autofocus!), but there's nothing magic about it.
The iPhone has a high-res display, hi-capacity battery, associated regulator hardware, multiple SoCs, cost of supporting a software/firmware engineering team for the OS/drivers, app developers, UX/Design experts, company markup....
Not cheap. I don't buy this argument - they have a nice image processor, sure, but there's nothing special about their image processing. Sure, it's not going to be cheap, but I think you are vastly overestimating the cost... for a price point of $1k per phone, even at scale, you are talking maybe a fifth of operating costs ($200) going towards supporting that.
Remember Apple makes its money from app commissions, more than the hardware, and AFAIK they don't operate selling hardware at a loss.
You will need a SoC team, yes, you are looking at a multimillion dollar company, doable with VC I'd imagine. Or a team of brilliant-yet-bootstrapping individuals, willing to work for less.
What sucks most is that Logitech, which used to be a proud name (back when "Feels good / Feels better") has gone totally to shit. Even their mice are designed to fail in a year. (Spill on the desk, the mouse that only got damp on the bottom is dead, dead, dead. That can't be accidental.)
A full article on what went wrong at Logitech corporate could be enlightening, if not helpful.
Maybe I'm lucky, but I've got half a dozen Logitech wireless mice/keyboard-and-trackpad media keyboards in regular use, plus the MX Ergo trackball, plus a couple of G Pro keyboards, and I haven't been able to kill any of them even through pretty rough use on the mice particularly (they tend to get bounced around in the living room when the dog jumps on the couch, etc.).
"Can't be accidental" reads as conspiracy-huffing. Lemons exist, bad products happen, but this is a weird assertion without evidence. Like, I got bit by the MX518 double-click bug back in the day, but everything I've used from them since--and I won't buy Razer because their stuff's awful to look at, which has historically cut down the options a lot--has been unremarkably fine. Except for the MX Ergo. That thing is remarkably excellent.
You talk like you have never heard of Planned Obsolescence. But it has been in business school textbooks since probably before you were born.
Whirlpool refrigerators have a "light control" circuit board to make the dome light come on slowly when the door opens. It has two resistors carefully underspecified to fail after warranty end, requiring a $150 replacement board. (Lots of pics online.) Good for them, bad for us. A $2 transformer and diode would last forever.
Whirlpool washing machines have a Motor Control Unit board carefully designed so parts on it literally explode shortly after warranty end, requiring a $300 replacement. (Lots of pics with exploded parts online.) There is of course no need for the parts to explode, and commercial washers from the same manufacturer do not share this failure mode.
There is literally no legitimate reason why moisture on the bottom surface of an optical mouse should possibly have any effect on its operation or longevity.
> There is literally no legitimate reason why moisture on the bottom surface of an optical mouse should possibly have any effect on its operation or longevity.
Sure there is. You got a lemon with an unexpected gap in the base panel, and electronics don't like water much. That happens; that doesn't mean it's a defect by design, and the shouting about it needs a lot more substantiation than "well, have you heard of this thing in another industry entirely?". Tolerances get exceeded, stuff breaks, warranties are sometimes necessary. This sniffing about capital-P Planned capital-O Obsolescence, when this stuff works in the main quite well for a lot of people (and, as mentioned, my own pretty wide array of hardware from them), has a long row to hoe to substantiate it.
You buy a dishwasher once a decade or so. You buy mice rather more frequently than that. From where I sit, there's more incentive to hand you a decent product so you come back and buy another from the same manufacturer--and, as it happens, that's much of why most of the input devices kicking around my house and my studio are Logitech, because they're generally a decent floor of quality.
Lifelong user of Logitech trackball mice, now using the MX Ergo. I also used a Logitech keyboard for many years, and am still using a Logitech webcam. I've had an equally pleasant experience of reliable hardware with good Linux support.
No need for conspiracy - their mouse buttons are run out of spec which guarantees the infamous double-click. This is an issue on pretty much all their (esp. wireless) mice.
Great mouse, but I'm always missing the scroll wheels from the MX Master. As soon as they release a new MX Ergo with upgraded wheels, I'll be upgrading. Just smoosh the two together and it would be excellent.
Their mice still have the same terrible button that they had 17 years ago when I bought an original MX 518, as far as I can tell. I've fixed the button so many times on every mouse from them since then. I even made the mistake of buying the re-released MX 518 a year ago, thinking that since they'd put this new fancy sensor in, maybe they'd finally fixed the button. It took 4 months for me to get a double-click problem that required me to open it up and fix that damn microswitch. I don't know what mouse I'll get next, but I've completely given up on Logitech.
One of the reasons their buttons suck: they effectively run the microswitches out of spec. The D2F-01F requires a minimum of 5V with micro loads [0]. There would be a tiny spark during switching, and over time that would prevent a perfect contact; the higher voltage helps penetrate through. Omron has factories in different countries, e.g. Japan and China, with the China switches "known to be not as good as the original ones". Of course the mice feature the China versions too.
In other words, your best bet is still replacing the switches to ensure a longer life... I wonder if Logitech would ever consider stepping up the voltage via a capacitor charge pump to prevent double clicks.
True, it was a great brand when they were growing. Then they got big and the bean counters started shaving away at quality. See all the problems with the G815/915 keycaps (I looked into it because I was thinking of getting one).
I use Microsoft's $5 "Basic Optical Mouse" mice now, which generally last me for years and are surprisingly comfy (though typical $5 creaky plastic). I have RSI so I'm fussy with mice, but this one is pretty good for me. So even with bean-counting it's possible to make a quality, lasting product. I've only had to replace two of them in the last 10 years or so (and I have 6 of them in use). Both were even still working, but the plastic was just getting too creaky and glossy from all the use.
The software is an abomination too - they switched from a native program to an Electron app that I'd describe as "occasionally functional". And you do need to use the app to do simple things like checking the battery of wireless headphones.
The G502 is a reasonable replacement, though the shape's a little different. If you're good with wireless, I like the G602 as my normal go-to gaming mouse.
> that would be like if Apple stopped releasing new phones after the iPhone 4S (launched 2012), and it remained the bestselling phone through now.
If I can be a bit cheeky - this sounds like exactly the kind of frankly ridiculous comparison a founder would make about their product alright.
The reality is - nobody really cares that much. Whatever image you record is going to be gigga-smashed by whatever application you squeeze it through. I switch between the rubbish built-in on my laptop over wifi, and a GoPro tuned to the highest res and framerate the cable will take, with colour and exposure tuning etc.
Not one person has ever mentioned webcam quality in either case. As long as you can vaguely see _most_ of someone's face in _somewhat_ balanced light, that's good enough - for 99% of people in 99% of cases. Even in job interviews, where image is everything, it's irrelevant.
The problem that needs solving is audio. That actually matters.
I tried ping.gg recently which boasts high quality video and audio feeds (for a high price) and even then - meh. Video was entirely unimportant.
The only people who really care about live video feed quality are content creators with high-powered static systems, like streamers. Even then, they can just hook up a DSLR and smash it out of the park with little to no effort.
I don't see this as a real problem anyone bar a select few care about - and that select few has already solved the problem anyway.
> and think we can achieve a quality level between an iPhone and a DSLR
Good luck, but I think you're filling a spectrum nobody is concerned about.
I don't agree - my boss uses his iPhone a lot and I have commented on the quality of his video, and I've seen tons of others do the same. This is with most people using their laptops' cams, though, which are admittedly terrible. And with MS Teams, which is not very good at coping with poor video sources (even the free Jitsi is tons better).
Audio is important too, yes, but we all get noise-cancelling headsets at work that make this a non-issue. I live on a really noisy street and most of the year I have the windows wide open (Spain :) ), but curiously they don't even hear it unless one of those d*cks with a muffler-less motorbike races past.
By the way how does the gopro work out as a webcam? Never thought of that but it might be an option.
> By the way how does the gopro work out as a webcam? Never thought of that but it might be an option.
It works very well. The newest versions can be used directly and natively in webcam mode, possibly even wirelessly. I have a Hero 7 Black; I use a cheap adapter and it works flawlessly. One thing to bear in mind is that processing power for real time is limited - you won't be able to stream in full 4K with stabilisation and so on. But you'll get a high-quality feed from a dedicated camera, on which you can tune exposure and so on. Also, the fisheye + different lens output selection is actually quite nice.
I commented elsewhere addressing the rest of your comment - selection bias here is extremely strong for "highly specialised tech nerds who care", when my argument is "the world at large doesn't care".
> Audio is important too yes but we all get noisecancelling headsets in work that make this a non-issue.
It's not the audio out that matters. It's the audio in. Everyone uses shitty mics.
Because his opinion is supported by the status quo (crappy cameras everywhere).
My anecdata says the same. In most video confs your video is put up on maybe 1/16th of the screen, to the corner, and people generally share something like a presentation. I guess if you're in sales, it matters more since your face will be fullscreen.
But even then... most people don't work in shiny offices anymore, are you sure you want to display your dirty socks on the shelf in the back in all their 8k glory? :-D
Camera aside - people will do wonders for their video presence by putting some effort into their background and lighting. If you’re not going to use a blur or picture background, clean up the area, consider how it functions as a backdrop (including wash lighting, etc), and what impression it’s making. Lighting is king - be deliberate about it.
Completely agree with you on audio. I’m not entirely certain it’s solvable without cultural norms changing. Even the absolute best noise reduction systems cannot work in every environment and echo cancellation always has enough delay to cause people to “step” on each other. It’s just physics and propagation of sound waves.
At a minimum, normalizing the use of in/on ear audio needs to become the norm. Echo cancellation makes most “speakerphone” setups unbearable in my opinion.
I suppose my comment does seem like it’s talking about background noise. My major issue is echo cancellation. Background noise is a factor that really messes with echo cancellation but, if everyone is using on/in ear audio then it becomes less of a factor.
The amount of comments I got on my video quality once I set up my mirrorless with proper lights is insane. Nobody cares until they see someone with production quality video and they notice right away. It makes a difference. It's like if nobody cares if you wear a tee and a hoodie, but the second you break the norm and dress a bit nicer people do notice.
What even is an "AI powered" camera? You mean a device that will become useless when your company goes out of business in a few years, or shifts focus, or that only ever works on specific versions of Windows/macOS due to some required software?
No, what people want is a decent modern UVC webcam that adheres to USB standards.
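That's also easy to verify yourself: a standards-compliant UVC webcam needs no vendor driver at all, and on Linux it just shows up as a plain V4L2 device. A quick sanity check (assuming the `v4l-utils` package is installed; `/dev/video0` is an assumption, substitute your own device node):

```shell
# List USB devices and V4L2 capture devices; a true UVC webcam should
# appear in both lists with no vendor software installed.
lsusb
v4l2-ctl --list-devices

# Inspect the resolutions/framerates the camera exposes over the
# standard interface (no proprietary app required).
v4l2-ctl -d /dev/video0 --list-formats-ext
```

If the interesting features (zoom, exposure, HDR) only exist inside a vendor app rather than as V4L2/UVC controls, that's exactly the lock-in the parent is complaining about.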
I have the Lumina! I think the hardware has been pretty good from the very start when I got it about a year ago. The software originally was... not very good, but you guys have improved it by leaps and bounds over time. I still have to manually adjust the exposure (which is fine), and fiddle with settings to get the colors right, but it works better than any webcam I have so far. Keep up the good work.
I think mics and webcams suffer a simple problem: the person who enjoys the benefit of higher quality is not the operator. Indeed, the operator may never experience how they look or sound.
That might be true, but your reputation is at stake as well. For the same reason that you might wear a suit to an important meeting, you might also take steps to ensure that your video and audio quality are presentable when you're having an important meeting virtually. (Of course, once you've paid the cost for that, you can just do a good job for all meetings. Much easier than a trip to the dry cleaners.)
Not even just reputation. I have seen people with poor (often too soft) audio get 'snowed under' while trying to raise a point. It's much easier to ignore someone who is less 'in your face'. In that sense I think it's really important for effectiveness in meetings.
Appreciate the effort. I think one thing that hinders innovation is that, as soon as anyone makes a decent camera that sells well, a company like Logitech will just try a little harder and bring a better camera to market at a lower price point. It's not like this is rocket science, Logitech just doesn't have much incentive to try very hard right now.
Your product looks intriguing, does it support Linux at all?
One concern I have with a "smart camera" as a Linux user is that I won't get the simple plug-and-play experience I've come to expect from Logitech's products, as there's often a required software component/driver to get some or all of the device's functionality.
I know I represent a small part of the market, but I figured it doesn't hurt to ask.
I was thrilled to see Lumina and bought one. I unfortunately ended up having to return it due to a combination of buggy software and image quality that was no better than my laptop's built-in webcam. I really wanted to love it… but in practice it was no better than what ships by default. I also had tons of issues controlling white balance. The software depth of field was a bit better than what's built into Zoom/Google Meet, but it still doesn't come close to the illusion of a real lens effect.
Ended up going with a much clunkier and expensive setup with desk stands + a Sony a7c + a 28mm lens + a Yeti mic.
Another thing worth mentioning is that lighting is key — so I got a 3 point key light setup (LumeCube lights) which works well.
It might've been a matter of Lumina still being in beta — I hope it can improve to a point where this comment becomes entirely obsolete.
I had to abandon my review unit of the Lumina because the constant refocusing every time I moved my head was far too distracting for work. Could you please give us an option to turn that off?
First, I like what Lumina is trying to do. I pre-ordered, and since receiving the first batch, have been, and will keep, buying them.
However, caveat emptor. Right now, buyers should support Lumina to support innovation in this space, but be prepared to pay a quality price while they work on it. Several ideas didn’t work out, and they’re trying to correct. Each software release is better, but I worry the hardware sensor in the shipping models may just not be up to task.
I have four Brio and four Lumina in our office (several more just came!). LogiTune + Brio 4K is surprisingly (given Lumina marketing including the post above) better than Lumina in most every situation, and drastically better in high contrast lighting where Lumina shows severe washouts and color banding. Some of the article’s shots show washed out yellowing on the author’s face. Make that problem worse, and you can picture what Lumina is struggling with in some lighting. It’s probably good that he didn’t do a comparison yet, though I would note that the most recent software release might have done better than 75% of those tested in his particular setting.
Meanwhile, if you like Lumina’s features but want better video, Logitech Brio 4K + Xsplit Vcam gives the same “virtual cameraman” pan and zoom, bokeh background softness control, virtual green screen, logo/watermarks, and more. Just like Lumina that requires a separate app for “pro” mode (aka software processed), Brio 4K (also most any webcam works) requires Xsplit Vcam to enable those features.
With Lumina the extra app is just enough easier for non-tech users to remember, that’s the direction I’m leaning for our staff.
Going forward, I’d pay 2x the Lumina price for (a) large aperture and high dynamic range sensor running off USB-C, and another 50% for configurable onboard features like the pan and zoom that wouldn’t require a helper app and confusing settings in Zoom, Teams, etc.
It’s certainly possible within the price point. What I likely want is something like this on USB-C:
The quality on this 4K sensor + lens array is remarkable, the optical tele-zoom fantastic, the low light gathering shockingly good. The price point includes a full PoE camera engine and an audio mic.
True, so that wouldn’t fit in a laptop bag. But iPhone optics would fit easily in a Brio housing, and long throw zoom would fit in the Streamcam form factor. Given videoconf is now a fact of work, I’d argue innovators should not be afraid of a larger device to achieve better quality.
Like the author of this article, while I used to use a Sony A9 for web conferences that matter, now I use iPhone + Camo for high value teleconfs. It doesn’t have to fit in my bag, I’m already carrying it. The innovation needed here is for the mount.
All this said, Lumina’s software is iterating fast, their cameraman is easy and works well, and I am buying more of the Lumina over the Brio for our staff.
// @rlei: a few usability notes (1) give us a toggle to turn off the default background image (e.g., show black instead) while virtual camera is panning in, the default scene generates annoyed comments from participants, (2) don’t flip from black to the live image until the initial pan and zoom is done, and (3) consider offering an eye-contact filter and marketing that. Staff who prefer Brio affirm they would switch just for that. Also, (4), don’t make users search email to find a serial number to use the helper software with their Lumina camera.
I don't think it even needs to be this big. These outdoor security cameras have big metal enclosures for heat dissipation, weather shielding, PoE circuitry (transformers) and other stuff that a webcam won't need. There's also the "bigger looks better" effect in this market. After all, most of the applications of these rely on deterrence and it's important that potential thieves notice them. I wouldn't be surprised if that case is half empty.
I'm sure they could make a webcam with these optics and processing in a respectable size/weight. I really wish they would, too.
I'd love to try a Lumina webcam but at the moment it'd cost me $200 shipping and import tax to the UK.
It's a pretty big punt compared to buying something off Amazon that I know I can send back without a quibble if it's no better than my Logitech Streamcam.
Nice! I'd like to ask: as there's a growing trend for (e)motion-controlled 3D personas/avatars driven by conventional webcams, are you considering that as one of your potential directions of development? As a potential buyer, I'm in need of an inexpensive webcam with invisible IR backlight that would reliably capture facial expressions at moderate resolutions (up to 1280x960) even in a dark room, without being distracting or harmful. A fast internal basic encoder (maybe primitive, but fast) would be nice too.
If you can do enough in pure software why are you developing hardware at all? Why not make your product a software offering working with any webcam? Or OEM an existing webcam with your software?
> A big reason there hasn't been innovation is that the space doesn't attract entrepreneurs (because hardware is viewed as hard) or investors (because hardware is viewed as hard).
It is so easy to throw yourself into software, I wouldn't know where to start as an amateur with hardware, it involves electricity and circuitry and probably months of learning. An amateur can scratch together a proof of concept in a few weeks.
Aren't people who start businesses in a particular space usually professionals in that space already? It's hard to imagine someone who is new to software jumping in and trying to start a software company.
Not always - sometimes it's just an interest, or they don't necessarily have the technical means. For example, a restaurant owner is not necessarily a technical expert, but could have an idea for a generic product that benefits every restaurant out there, prototype something, get funded, and hire real devs to rework the prototype into a functioning, viable product.
I have met and interviewed with YC founders who are not developers by any means for software heavy projects.
> A big reason there hasn't been innovation is that the space doesn't attract entrepreneurs (because hardware is viewed as hard) or investors (because hardware is viewed as hard).
Why though? It's all off the shelf stuff. If it's in phones you can put it in your product.
> that would be like if Apple stopped releasing new phones after the iPhone 4S
Would that be such a bad thing? Arguably there haven't been many technological leaps in the years since, and if it weren't for planned obsolescence the 4S would still be a totally fine phone.
Sad to see such an imperfect monitor-mount design on a webcam that tries to be better. For better eye contact, the lens should sit as low as possible, as close as possible to where you are looking. This one doesn't even seem to try.
>There's enough gross margin to produce a $B company just by selling webcams [0], especially if you can actually get customers excited about the product.
Is this from HW only or some sort of MRR from SW lock-in?
Imagine you ship a webcam and then find out that one of the capacitors you used is undersized and the camera randomly reboots when another USB device is plugged into a nearby port. You just shipped 5,000 of these and customers are unhappy.
Unlike software you can’t just issue an update. You must manufacture new devices, issue refunds, send replacements. And all the while your competitor sells a very comparable $50 device so you can’t charge a subscription fee and make thousands off each customer over a lifetime. Software margins are huge compared to hardware and risk is often much much lower.
This is wild. What? How is this justifiable? Why would anyone pay that amount when you can get an old DSLR and produce amazing-quality video with some tinkering, or get a Logi Streamcam or similar for just-as-good quality?
I don't understand the business model / viability of dedicated webcams. Software like Elgato Epoccam or OBS is free or cheap, and the % of people who would benefit from a decent webcam and who don't own a modern smartphone is literally 0.
People who have video calls throughout the day constantly. I don't want to have to mess around setting up my phone on a stand and getting it connected every time I need to do a meeting.
I would sound a slight note of caution there, in that AFAIK not every camera has clean HDMI out and there can be challenges about powering cameras for longer periods of time (if needed).
It is a good option though if you need higher quality. I'm using a Sony ZV-1 and it's worked out pretty well.
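As a side note, for cameras that `gphoto2` supports over USB there's a commonly used Linux alternative to an HDMI capture dongle: piping the camera's live view into a virtual webcam device. A rough sketch, assuming `gphoto2`, `ffmpeg`, and the `v4l2loopback` kernel module are installed, and that the loopback device lands at `/dev/video9` (both the device number and camera support are assumptions, check yours first):

```shell
# Create a virtual webcam device; exclusive_caps=1 helps browsers and
# conferencing apps recognize it as a capture device.
sudo modprobe v4l2loopback exclusive_caps=1 card_label="Mirrorless"

# Stream the camera's live view into the loopback device.
gphoto2 --stdout --capture-movie \
  | ffmpeg -i - -vcodec rawvideo -pix_fmt yuv420p -f v4l2 /dev/video9
```

This sidesteps the clean-HDMI requirement entirely, though the power and overheating caveats for long sessions still apply.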
Those cost more. It doesn't make sense to spend more than needed on something you're only going to use as a webcam. Webcams don't have to spend on holdability, physical durability, or any viewfinding.
I also dual call into many meetings. For these, I use my phone to see how my screen sharing is working. I have to do a lot of screen-sharing presentations to present data (tables, figures, etc.). With this use case, it's very helpful to have a second login to be able to see what everyone else sees.
Or I’m in a group meeting and I see a text or even a phone call I want to quickly respond to. Apple’s demo at WWDC seems interesting but not sure how much I would actually use it unless it was to partially repurpose an old phone.
As for cameras, I got things set up with my Canon 5D iii, but it was way too much trouble to use, given that even for pre-recorded video a Logitech C920 is most of the way there - and I'm better than most of the people recording videos.
I've only ever had trouble using phones as webcams. It seemed like a great idea, but I've probably wasted a good 40 hours of my life trying to make them work right. And when I do get one working, 6 months go by and they force you to "update" to more broken features and then charge money for basic functionality. Whoops, this software doesn't let you flip the camera the right way. This one doesn't like USB for some reason, so let's try wireless - whoops, it completely saturated the wireless connection with uncompressed video and drained my phone faster than it could charge. This one ghosts horribly if you move slightly too fast. Oh, this one seems to work... what the hell, I dared put my hand too close, or a light source came into view for a second, and it completely borked the exposure levels and failed to readjust, leaving an either mostly black or mostly white picture.
Since I don't use it constantly, if I expect to need a webcam I have to set aside at least a couple of hours the day before to make sure it actually works when I turn it on again. And even then I still run into trouble where certain programs see it and certain ones don't. Then I've got to push the video through multiple different programs to get the one program I need to see it in the correct orientation. What I thought should be a super easy task turns into this huge ordeal and 25 new sketchy data-stealing programs on my phone and computer. I even had a few which I can only assume were mining cryptocurrency, given the amount of processing they used even when the camera wasn't on.
Well, it's at least 1, because I bought a "decent webcam", and while I own a smartphone, I don't want to use it for long Zoom meetings.
I bought one of those Instagram influencer light rings so I actually look normal on camera instead of like a corpse, and the camera lives on top of my USB C monitor.
[0] https://s1.q4cdn.com/104539020/files/doc_financials/2022/q4/...
[1] https://www.windowscentral.com/lumina-ai-webcam-review