Originally published at: Opal OP-1 claims to be the first "professional" webcam | Boing Boing
$300 seems a bit steep to me.
Why doesn’t some enterprising person create an app that lets your phone camera be seen as a USB webcam when connected to your computer? (I know too little; perhaps there are limitations/blocks within the OS that prevent that.) Then you could use it for Zoom meetings. There’d be money in peripherals, too, like a phone holder that connects to your monitor. If you already have a superior tool that you paid hundreds (or even a thousand plus) for, you should be able to use it in more ways. Or repurpose an old cell phone for this. Win, win, winning.
You can use a phone over wifi as your webcam without too much hassle using something like Droidcam.
Opal OP-1 claims to be the first “professional” webcam
Well sure they do, adding that one word means they can double the price!
I do a lot of Zooming, and I’ve never had a problem with how my laptop’s camera makes me look, and I don’t much care if other folks aren’t coming through especially great visually. I guess those who make videos of themselves would care more, though.
While it is true that webcams in general suck, the reality is that they don’t evolve much because most people find them “good enough”. That was not the case with cellphones, which have been touting camera improvements to attract sales for almost a decade.
And DSLRs are popular not only because of image quality, but also flexibility. I own a micro 4/3 camera for documenting my repairs, and I couldn’t replace it with a webcam (or a cellphone) because I swap lenses to shoot everything from macro video to wide shots.
Here’s an excellent analysis of why webcams are so dire, even the stand-alone models you’d think would need to offer decent quality to sell:
Anyone who cares about video quality probably already has a DSLR or mirrorless camera. Many camera makers made software available during COVID to turn them into webcams; otherwise you can hook up their HDMI outputs to something like the Elgato Cam Link 4K. The only challenge is finding a way to power the camera, as batteries aren’t up to two-hour Zoom sessions and not all cameras have AC power options.
And everything will still look washed out and wonky.
Because a good part of the problem is lighting.
People who want “professional” webcams use standard video cameras and DSLR/mirrorless systems. They’ll also tend to know enough about lighting to at least improvise something.
Webcams are routinely garbage, but they are REALLY garbage if you don’t have light skin. It’s almost as if they are designed and engineered by companies without diverse teams.
That’s hardly a problem with only webcams. The auto settings on almost all cameras assume light skin tones.
It’s a bit compounded by the lighting issue I mentioned. Darker skin tones often call for more/different lighting.
We actually had an entire lecture in film school on how not to be that asshole.
I suspect that it doesn’t help that most cheap webcams have lousy sensitivity and absolutely atrocious dynamic range; qualities which are always photographic sins but work particularly poorly if important facial details are less reflective than, and a substantially different color from, one’s teeth and sclera.
At least in the cases where I’ve been able to compare cameras from the same company (and so, presumably, at least similar levels of interest or disinterest in getting good results across a diverse sample set), the quality of the results has definitely followed the overall capabilities of the camera pretty closely.
I’d be curious to know how much could be compensated for through different firmware settings and image-processing witchcraft, rather than just brute-forcing the issue by going from $50 worth of optics to $1,000 worth; but I’m not sure how I’d test that without either a supplier of webcams designed specifically for the purpose or significantly greater image-processing chops than I possess.
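For a sense of how simple some of that image-processing witchcraft can be, here is a toy gray-world auto white balance in a few lines of NumPy. This is a made-up illustration, not any vendor’s actual pipeline; the point is that the algorithm bakes in an assumption about the scene, and baked-in assumptions are exactly where un-diverse test sets bite.

```python
import numpy as np

def gray_world_balance(img):
    """Toy gray-world auto white balance: scale each channel so its
    mean matches the overall mean. It assumes the scene averages out
    to neutral gray, which is the kind of baked-in assumption that
    misbehaves on subjects it wasn't tuned for."""
    img = img.astype(np.float64)
    channel_means = img.reshape(-1, 3).mean(axis=0)
    gains = channel_means.mean() / channel_means
    return np.clip(np.round(img * gains), 0, 255).astype(np.uint8)

# A flat warm-lit patch: red channel runs hot, blue runs cold.
warm = np.zeros((4, 4, 3), np.uint8)
warm[..., 0] = 180  # R
warm[..., 1] = 120  # G
warm[..., 2] = 60   # B

balanced = gray_world_balance(warm)
print(balanced[0, 0])  # all three channels pulled to the common mean
```

On this synthetic patch the correction looks perfect; on a real face filling the frame, the same math would push skin tones toward gray, which is why real pipelines are more careful (or should be).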
A big part of my point is that a fair bit of that can be offset by adding some light. Even really early digital cameras far crappier than today’s webcams, and old-school “I have no settings” disposables, can take a pretty good image if lit properly.
Quite a bit, I’m thinking. If you look at what Google has been doing on the software side with their cellphone cameras these days, you can do a hell of a lot to overcome sensor shortcomings. They’ve even been attacking the whole skin-tone thing.
I don’t think post-processing would tell you much; once an image is recorded, the data is there or not. A lot of the whizbang you see, like “AI” upscaling and whatnot, is essentially drawing in new detail from whole cloth. But for a programmer who knows a bit about how such things work, it probably wouldn’t be all that difficult to knock together an alternate bit of software on the OS side. And there’s probably something already out there.
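The “data is there or not” point is easy to demonstrate with a quick NumPy sketch (the reflectance range is just an assumed stand-in for shadow detail on a face): quantize a smooth dark gradient to 8-bit code values, then lift the shadows in post. The image gets brighter, but stretching can’t invent levels the sensor never recorded.

```python
import numpy as np

# Simulate a sensor crushing a smooth dark-to-mid gradient
# (2%-10% reflectance) into a handful of 8-bit code values.
scene = np.linspace(0.02, 0.10, 256)
captured = np.round(scene * 255).astype(np.uint8)
levels_before = len(np.unique(captured))

# Lifting the shadows in post (a simple gamma curve here) brightens
# the result but cannot restore detail that was never recorded:
# the count of distinct levels does not go up.
lifted = np.round(255 * (captured / 255.0) ** 0.45).astype(np.uint8)
levels_after = len(np.unique(lifted))

print(levels_before, levels_after)
```

That lost detail shows up on screen as banding/posterization in the stretched shadows, which is why fixing exposure at capture (or just adding light) beats any amount of post.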
And what’s that alternate Canon DSLR firmware, Magic Lantern or something? It basically did this sort of thing in camera.
I do know that with my Logitech webcam, which is pretty crappy all things considered, things look significantly better in the Logitech software than in the Windows setup or the controls of most video chat apps. And the minimal controls the Logitech software offers let you improve things quite a bit.
It is quirky to get those settings to carry over to anything else, though. Unlike my Logitech mouse, where the settings can be stored on the mouse or otherwise fixed to it, the webcam software usually has to stay open for its settings to stick, and in most cases nothing likes pulling that video signal to two places at once.
I feel like half the issue with this shit is the totally crap standards and protocols around webcams. There isn’t really a reason why OSes or software can’t just pull any video input as a webcam signal, why all cameras (including your cellphone) can’t just put out a webcam signal, or why a webcam input can’t be used by ten thousand bits of software and equipment at once, with image properties set at the camera.
Regular video signals work exactly like that, provided you have the inputs. I was actually a little shocked at the beginning of the pandemic that regular cameras still aren’t plug-and-play as webcams, because I remember doing it all the time, even in the early ’00s. Then I realized I was attending film school and working in video, where I practically always had access to the inputs to do live capture and the software to do whatever the fuck I wanted with it.
I just don’t see why you can do it with regular video signals but not with webcam protocols and drivers. Or why we don’t just use the regular video signals. Some of this shit is stuff a VCR could do.
For tax reasons, my camera can’t record more than 30 minutes of video.
Avermedia made a 4K webcam with an APS-C sensor (which is larger than 7.8mm).
I have a Logitech 4K Pro, previously known as the Brio. It can see in the dark. Don’t know how large its sensor is.
Yes, I can technically record from my real camera’s HDMI jack, but it’s a real pain to set up.
The camera itself. If it can put out a live video signal, that limit doesn’t apply, because a live video output isn’t recording. The common way around it is to use the video output to record on something else, which is practically the same as how a webcam works. Usually that goes direct into video software; back in the day you could do it straight to a tape deck.
Weirdly a lot of still cameras still don’t output a live video signal though.
IIRC the tax in question is in the EU, and it’s slated to end at some point soon. I imagine clip limits might remain, though, for product segmentation. Even before the tax issue, clip limits were normal due to storage limitations; you probably weren’t gonna get more than the then-standard 7 minutes on a CompactFlash card circa 2004. And there’s money to be had charging extra for the upgraded, video-optimized versions of these things (which is why I think it’s weird more still cameras don’t include video output and webcam features; it’s an easy add, and the hardware is usually already there).
In the EU, I assume. A camera in webcam mode or using HDMI is not recording, and thus would presumably be exempt.
I’m not sure. It’s a D7000, and in order to get a clean HDMI out I had to install hacked firmware. Anyway, I can get a really sharp image of my face by using a macro lens and really dialing in that focus.
But, when one receives a message that the zoom meeting is in 15 minutes, the setup process-- plug in good microphone, plug in headphones, put camera on tripod, tune focus-- gets to be a real bear.
All cameras make me look fat and old. I use my imagination-cam, which makes me young and handsome.
The biggest gripe I have with the standard Logitech option and some alternatives is that they’re often wide-angle lenses… these seem to be designed for conference rooms! I just need people to see my face, not the whole room.
I use my iPhone with a stand, it’s less convenient but it seems better suited for individual faces (and the quality is much better!).
ad copy for the 930
The webcam must be really close to the subject. Attached to the monitor, at a 50 cm reading distance, a 90 degree field of view verges on panoramic.
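The geometry behind that is easy to check with the usual pinhole relation (the 65-degree figure below is just an assumed narrower comparison point, not any particular product’s spec):

```python
import math

def covered_width(distance_cm, fov_deg):
    # Horizontal scene width a lens takes in at a given distance:
    # width = 2 * d * tan(fov / 2)
    return 2 * distance_cm * math.tan(math.radians(fov_deg) / 2)

# At a 50 cm reading distance, a 90-degree webcam sees about a
# metre of your room; a narrower 65-degree lens sees much less.
print(round(covered_width(50, 90)))  # ~100 cm
print(round(covered_width(50, 65)))  # ~64 cm
```

A metre-wide view at reading distance is most of a small room, which is why a head-and-shoulders shot from a conference-room lens leaves so much background in frame.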
Even Apple has crappy webcams. Their 16" MacBook “Pro” has a 720p camera instead of 1080p, and that doesn’t even get into the dynamic range and other issues. Plus Apple has no manual overrides for the auto settings, or manual controls of any kind, not even a brightness adjustment. I can’t turn the brightness down when the camera overexposes my face.
Because Apple and other computer makers have dropped the ball, and because setting up a DSLR as a webcam is futzy and awkward (I’ve done it multiple ways and have defaulted back to my inferior webcam for convenience), I’d definitely consider an easy-to-use, high-quality webcam with good controls.