i was at this talk. the attack basically reprograms the controller for the OSD (the ugly pop-up box you get when you e.g. change brightness or input source) over i2c. the OSD is usually just that ugly box, but that’s only laziness; the hardware/firmware supports arbitrary reads and writes of any pixel on the screen.
iirc you do need some super-user privs (or an escalation or bypass) to get the malware on the controller in the first place. i dunno if this is going to actually be used in practice, but it is interesting and a bit worrying.
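the channel itself isn’t exotic, by the way: it’s the DDC/CI side-band that every vga/dvi/hdmi cable already carries, and on linux the gpu driver exposes it as /dev/i2c-* devices once the i2c-dev module is loaded (which is part of why you need root or similar in the first place). here’s a minimal sketch in python of talking to the monitor’s controller at its standard DDC/CI address 0x37; the bus number is an assumption you’d have to check, and the command is just an ordinary brightness query, not anything from the talk:

    import fcntl, os

    I2C_SLAVE = 0x0703            # ioctl from <linux/i2c-dev.h>
    DDC_CI_ADDR = 0x37            # standard DDC/CI address of the monitor's controller
    BUS = "/dev/i2c-4"            # assumption: the bus routed down your display cable

    fd = os.open(BUS, os.O_RDWR)
    fcntl.ioctl(fd, I2C_SLAVE, DDC_CI_ADDR)

    # DDC/CI "Get VCP Feature" request for brightness (VCP code 0x10):
    # source address 0x51, length | 0x80, opcode 0x01, VCP code, then an XOR
    # checksum that also covers the destination address (0x37 << 1 = 0x6e)
    msg = bytes([0x51, 0x82, 0x01, 0x10])
    checksum = 0x6E
    for b in msg:
        checksum ^= b
    os.write(fd, msg + bytes([checksum]))
    os.close(fd)

the exploit presumably speaks over that same bus and address, just with the vendor’s non-standard commands instead of a brightness query.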
the OSD controller is limited to a palette of 16 colors at any given time, i think. i guess you could work around that by rendering sensitive material in a continuously shifting palette; the malware would then have to keep rewriting its own color table to stay in sync with the real image, which would be a bit challenging.
Like ^^^^ they said: no.
Even in the 20th century, if your display was digital, you couldn’t. A co-worker of mine was writing a Pascal program and thought he’d just do that, so he dimensioned an array of 512x512 8-bit pixels. With the CPU we were using back in those days, you couldn’t have a 512x512 array, because it was a 16-bit processor and an array that size was bigger than 64K. You had to tell the coprocessor to do something with the pixels. The co-worker’s ‘array[]’ declaration failed (I don’t remember whether at the pre-processing or compiling stage), but it failed with a ‘too big’…
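Just to put numbers on why that declaration blew up (my arithmetic, not part of the original story):

    # a 512x512 array of 8-bit pixels vs. the 64K segment a 16-bit pointer can address
    pixels = 512 * 512 * 1        # one byte per pixel -> 262,144 bytes (256 KiB)
    segment_limit = 64 * 1024     # 65,536 bytes
    print(pixels, segment_limit, pixels // segment_limit)   # 262144 65536 4

So the array needed four times what a single 64K segment could hold.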
Now we have 32-bit and 64-bit processor cores, but you still have to tell the GPU what to do. And it is another processor. It isn’t really reasonable to address each pixel with a general-purpose CPU. Therefore, your digital monitor is now really a computer with TV-like features. Although it might be possible to design a dumb digital display, it isn’t practical.
Now that we have DRM and the DMCA etc. etc. the GPU has become this unknowable parallel universe controlled by blobs you are forbidden to question. In other words, a natural hiding place for 'sploits.
that is largely true, but the exploit is totally independent of the gpu. it modifies the monitor’s firmware over the i2c channel; the gpu just passes the commands through, untouched, down the display cable to the monitor.
i think the major reason for blobs in the gpu is trade secrets. i’m not familiar with any case of drm enforced in the gpu; do you have a reference?
a monitor could be secured against this fairly easily (e.g. by only accepting signed firmware over that channel), but that would add cost and complexity, and security is something almost nobody is willing to pay for, so it doesn’t happen, at least at the low-to-mid consumer level.
Really … this just cements that there is no way I would ever do bio-hacking because … fuck … there are some people in this world who would hold my digital heartbeat for ransom … or run down the battery pack for my spleen, or whatever.
I am getting sick of computers and internets, anyways.
Didn’t realize I was coming off as a conspiracy theorist. Planned obsolescence is a thing. Security problems accelerate it. Leaving the back door wide open pretty much jumps straight to obsolete.
I get emotional when completely functional devices become e-waste overnight. And just think! Our cars and trucks are headed straight for this same security conundrum, only more dangerous. Keep connecting them to the internet, I guess, because it’s convenient.
Pretty much all modern GPUs (including iGPUs like Intel’s onboard units) in consumer computers implement High-bandwidth Digital Content Protection (HDCP), an encryption standard for display connections meant to prevent display capture for the purposes of piracy. The theory is to have the OS, the video-player software, and the hardware, including the GPU (since that implements the display output), collude to encrypt multimedia in such a way that it can only be read out by unmodified monitors at the other end of the display connection.

Of course, like all DRM it’s subject to myriad issues around both efficacy and user inconvenience, but since HDCP support is widely available now, and most DRM breaches happen at the much more convenient software level anyway rather than via the hardware-level bypass HDCP was designed to prevent, it rarely shows its face to most users, even most multimedia pirates. About the only time I’ve heard of it outside of old hardware (close to 10 years old by this point) and free software/hardware communities is the relatively recent issue it caused when the newest consoles applied it to games, so that gameplay recorders stopped working.
It’s worth noting that the GPU only enforces DRM on the link between itself and the monitor; the whole system has to be locked down in order to get the full DRM chain from media file to monitor.
Why would the monitor need updates? Because everything is rushed out the door, and the ability to fix something with a software patch is much cheaper and easier than having to replace a [monitor/laptop/cable box/toaster].
monitors were annoying to configure even in the 90s before they added this functionality you’re complaining about. i can’t even imagine how much harder it would be today.
yes, there are security holes, and yes, they could improve the security significantly; but i’d still never want to go back to the old days when you needed a monitor driver, or at least several pages of configuration, just so you didn’t literally burn out your phosphors or wreck the electron gun because you got the scan rate wrong in software.
one of the only notably used pieces of proprietary commercial software for Linux in the 90s was a tool to automatically configure your monitor for the X Window System, because doing it by hand was so tedious. basically it was just a database/expert system of monitor configs, and even that came with a warning that it could permanently brick your display.
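to give a flavor of what those pages of configuration looked like: an XFree86 modeline was just a pixel clock plus raw sync timings, and the scan rates your monitor actually got were whatever fell out of the arithmetic. a rough sketch in python (the numbers are the standard VESA 1024x768@60 modeline, used purely as an illustration):

    # Modeline "1024x768" 65.0  1024 1048 1184 1344  768 771 777 806
    # i.e. pixel clock in MHz, then horizontal and vertical timings:
    # visible, sync start, sync end, total
    pixel_clock_hz = 65.0e6
    h_total = 1344                          # pixel clocks per scanline, incl. blanking
    v_total = 806                           # scanlines per frame, incl. blanking

    h_scan_rate = pixel_clock_hz / h_total  # ~48.4 kHz
    refresh_rate = h_scan_rate / v_total    # ~60 Hz
    print(f"hsync {h_scan_rate/1e3:.1f} kHz, refresh {refresh_rate:.1f} Hz")

mistype one of those totals or the clock and a fixed-frequency crt could end up driven outside its rated scan range, which is exactly where the ‘brick your display’ warnings came from.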
Planned obsolescence is a thing, absolutely. Nvidia does not ship the fanciest card it can make; it deliberately staggers its production to create planned performance jumps while holding some capability back. Apple updates connectors regularly to force new hardware purchases.
Planned obsolescence has nothing to do with what you’re suggesting, which is intentional security sabotage. Additionally, we should observe that ‘leaving the back door open’ in this case is more like ‘someone discovered that a previously solid piece of wall is actually a door.’ I don’t even know if anyone was discussing monitors as an attack vector.
The problem is that you’re treating this as an issue of fault. It’s easier that way, because then there’s a culprit: someone made a purposely bad decision with malicious intent, and so you have an easy answer: that person is Bad.
What is almost certainly the truth is that this was an easy-to-make oversight, and it exposes an awkward security gap that seems to require that you either get physical access to the monitor first or pwn the host PC, and from there you still need to do some crazy work to get data back out of it.
Edit: Let the record show I was wrong about the following part. (and it seems to need to be connected over USB, which isn’t ubiquitous. A couple of sources mentioned HDMI, but I couldn’t find any comments on it in the original (awful) presentations.) Re-reading makes it clear that displaying is enough, as it exploits an issue in the display system itself.
Huh, so the display has to be connected to the computer over USB (via USB-in) for these attacks to work? These “exploits” seem very… happens-in-lab-only… and theoretical… to me from a cursory look.
they developed it to work over usb originally, but then got it to work over just the hdmi/vga/whatever cable, using the i2c protocol.
of course it still needs a root vuln or something to deploy on the target. it seems really unlikely that there’ll be a case where this trick helps you out more than some other trick would. it is cool though.
now if you could get an infected monitor to root a computer, that’d be fun times.
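to make the ‘works over just the cable’ point concrete: the same ddc wires also carry the monitor’s EDID at i2c address 0x50, so from the host you can simply walk the i2c buses and see which ones have a display hanging off them. a rough sketch in python, assuming linux with the i2c-dev module loaded and enough privileges to open the devices:

    import fcntl, glob, os

    I2C_SLAVE = 0x0703      # ioctl from <linux/i2c-dev.h>
    EDID_ADDR = 0x50        # monitors answer here with their EDID block
    EDID_MAGIC = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

    for dev in sorted(glob.glob("/dev/i2c-*")):
        try:
            fd = os.open(dev, os.O_RDWR)
        except OSError:
            continue                      # no permission, etc.
        try:
            fcntl.ioctl(fd, I2C_SLAVE, EDID_ADDR)
            os.write(fd, bytes([0x00]))   # set the EDID read offset to 0
            edid = os.read(fd, 128)       # first 128-byte EDID block
            if edid.startswith(EDID_MAGIC):
                print(f"{dev}: a display answers on this bus")
        except OSError:
            pass                          # nothing at 0x50 on this bus
        finally:
            os.close(fd)

once you know which bus the display is on, the firmware-poking from the talk presumably happens over that same bus, just at the DDC/CI address (0x37) instead of the EDID one.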
Apple does a lot of nasty stuff, but I don’t think they change connectors for this purpose. Look at their history of connector changes: they used the same dock connector on their portable device line for years, and now that they’ve changed to Lightning they’ve stuck with it, even despite the genuine technical benefits of switching (interoperability with other phones’ accessories and Mac accessories, without the disadvantages of micro-USB), precisely to avoid this issue. On the Mac side they use the same standard ports as competitors, with the switch to USB-C being more of an eager-early-adopter thing than an attempt to milk people for accessories (see also the new Chromebook using the same approach). Outside of that, their connectors have changed more slowly than their competitors’; there’s a reason so many iPhone accessories using the dock connector were around years before microUSB-based ones.
I don’t know of a specific GPU exploit. My point was simply that we no longer have displays lacking processors, and that modern displays have 2 processors in them. And that digital video has required embedded processors from its very beginnings several decades ago.
Between the many processors and the fact that most people never get around to applying the latest software patches to their appliances, you have a truly huge attack surface with plenty of nonvolatile storage in the device.
And just because I don’t know of a GPU exploit doesn’t mean they can’t exist.