$999 monitor stand explained

Glad you heard it from me and not from the Googles of ‘fucking tree branches’.

Whew!


At least on the video side a lot of that stuff was MacOS only to start, and it was more that Apple bought them to prevent them going cross platform. Final Cut, for example, was developed by Macromedia for both Windows and MacOS. Apple basically bought it out before release, as Macromedia was trying to sell off the program/division and Apple wanted to lock it down as MacOS only. A lot of those buyouts happened just as their dominance there was starting to crack. So it always seemed more like an attempt to lock people in as they had less of a clear lead in terms of stability and power for that kind of work.

AMD’s clock speeds went down a lot with the new product release, and are creeping upwards as things tend to do with refinement. But the new processors are a lot faster by nearly every other measure. They do more work per Hz than the previous ones. By a shit ton. Intel have tweaked theirs up as they’ve refined things, over and over and over for 6 years. But their recent chips are apparently running startlingly hot, since they’re marketing heavily on slightly higher clock speed to try to counter the attention AMD is getting, and they might have pushed it to unwise places. There hasn’t really been a lot they could do to release more powerful versions for a while.
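To put rough numbers on the “more work per Hz” point: throughput scales roughly with IPC × clock. A minimal sketch, with made-up IPC and clock figures purely to show the shape of the argument:

```python
# Back-of-envelope throughput model: work/second ~ IPC x clock.
# All numbers here are hypothetical, just to illustrate the tradeoff.
chips = {
    "old chip (high clock, low IPC)":    {"ipc": 1.00, "ghz": 4.7},
    "new chip (lower clock, higher IPC)": {"ipc": 1.25, "ghz": 4.2},
}
for name, c in chips.items():
    print(f"{name}: {c['ipc'] * c['ghz']:.2f} units of work/sec")
# old chip: 4.70 -- new chip: 5.25, despite the 500 MHz clock deficit
```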

Generally more cores means lower clock speed, since more stuff means more heat. And that’s been the case for a while. It’s just that more cores are finally making inroads into consumer parts, and where software can take advantage of it, it’s generally faster than fewer cores and higher clocks. One thing that did proliferate while everything stagnated is the ability of chips to dynamically clock themselves up and down, turn off whole cores, etc. per workload. So a lot of those high core count parts can actually run just as fast as their lower core count brethren if the additional cores aren’t being used. That’s part of what’s up with all those “boost” clocks. Particularly over on AMD’s side, where it’s literally all the same chip with different broken bits lasered off up and down the product stack.
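A quick Amdahl’s-law sketch of that cores-versus-clocks tradeoff, with hypothetical chips; where the crossover lands depends entirely on how parallel the workload is:

```python
# Amdahl's law: with a fraction p of the work parallelizable across n cores,
# time ~ (1 - p) + p / n. Scale by clock to compare two hypothetical chips.
def throughput(cores: int, ghz: float, p: float) -> float:
    return ghz / ((1 - p) + p / cores)

for p in (0.5, 0.9, 0.99):  # parallel fraction of the workload
    few_fast = throughput(cores=8, ghz=5.0, p=p)
    many_slow = throughput(cores=16, ghz=4.2, p=p)
    print(f"p={p:.2f}  8c @ 5.0GHz: {few_fast:5.1f}   16c @ 4.2GHz: {many_slow:5.1f}")
# At p=0.50 the higher clock wins; by p=0.99 the extra cores dominate.
```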


Wow, I don’t think I’ve ever seen a 100% Mac shop for video/film… There have always been apps that only had Windows/Linux seats, and in my experience I’ve only seen architecture or graphic design firms that were exclusively Mac. But plenty of films and commercials were cut in Final Cut Pro on Macs, and high-end vfx/compositing was done in Shake (before Apple killed it)… and eventually there was even an OSX version of Maya… people bought it!

Creatives loved the MacPros because you could run them for months without a restart and you didn’t need a tech to repair them. And once they went to Intel, you could also run Windows and Linux, so that same box was very valuable in a multi-OS shop… You could cram it full of GPUs (even Nvidia cards). People like myself switched to Mac in that 2006-8 generation - I wasn’t a holdover from Power Mac days.

But I totally agree with you there. Like I said in my original post, if they had released the equivalent of this 2019 MacPro in 2013, we wouldn’t need to split these hairs (also if Apple hadn’t killed all the pro apps they bought). In that timeline, shops could still have their artists using Macs for a significant portion of creative work, and that clientele would have driven demand for software development on the Mac OS platform. That didn’t happen, of course. Apple’s obstinacy in the face of criticism and retreat from the pro market led people like me to flee, and I’m afraid that this machine is too little too late (or too much too late).

All I’m saying is ask someone who knows. If you’re trying to make sense of this release from that perspective, it doesn’t really work without the context.

Yeah I get that. All the reporting is the equivalent of “lul, Apple so out of touch.” Or “actually, this is totally in line with the pro market.” Neither of which is entirely false or true. The fact is Apple has never been clear about their intentions since they think that being sooper secret about everything is the best marketing plan always (it’s not).

Roger that. I totally agree and I get now more of what you were saying originally. These are my feelings on the matter too, so I’ll sign off and hand in my internet pedant badge while I’m at it :+1:


IT WAS A THING. I wasn’t joking about the Apple branded glass doors and walls, though I believe that was a Dot com boom/early nineties thing. And the place I saw it was no longer Mac based at the time, they just didn’t see the point of replacing the glass with the big frosted Apple logos on it.

Remember, until ’99 even Avid was MacOS only, and Windows non-linear editing was basically Adobe Premiere, which wouldn’t be “professional grade” until about the mid-00s. Final Cut Pro was spearheaded by the guy who designed the first few versions of Premiere, breaking off because he wanted to create more professional software.

It was still a relatively normal thing when I started film school in ’02, but basically changed while I was in film school. So that by the time I was finished, even majority-Mac workflows seemed to not be a thing. Which was great, ’cause schools stuck to that longer. And everything I had learned was Mac based…

You’re basically talking about an early-days-of-commercial-non-linear-editing thing. In hindsight it did not last long, basically very late 80s to very early 00s. Design, and some other fields, seem to have stuck with it longer. Schools were largely the last holdout.

Like pretty much any non-Mac computer at the time?

That’s one of those weird things with Mac, given how much they’ve restricted upgrades over the course of the last decade. We’re suddenly talking about the MacPro as “modular” when all that means is you had the same ability to replace and add parts as on a standard PC from any other manufacturer at any other time. And a fair bit of the modularity they’re pushing with this new one is of the same sort (though that case seems very customizable). So it’s literally what I, as a person who has never owned an Apple product, have been able to do with any PC I have owned or used, no matter how cheap and how non-specialized?

Now that’s a “modular PC” instead of just normal?

They didn’t kill all of them. They made a lot of them not pro, and made it very expensive to turn them back into pro software via individual purchases of expensive plug-ins.

[image: marram grass micrograph] The Marram Grass micrograph of ghosts and vibranium smileys compels us all to make living walls [almost does emotional work of checking LA union code, north LA power cut addenda, but is deflected by the jaguar print woad.]

Sauce: http://beyondthehumaneye.blogspot.co.uk/2009/06/dune-builder.html via buzzfeed
And yeah, it has been a while since I’ve seen an all-Amiga video shop. [ducks a professional trailer stoop] So I’m wondering, now that SiFive (RISC-V) has $150M more, maybe they have super great Sony camera I/O cooking.


If you’re spending $5K or $6K for the best Retina display on the market, does $200 really make that much difference?

This is for institutions much more than freelancers or hobbyists. I’ve spent more than that on a hard disk adaptor for an Indigo.



https://www.sailingworld.com/racing/mother-all-winches/

Ok so first: Retina is Apple’s own weird branded, non-standard group of resolutions. And even within their product line it’s an ever-shifting standard based on pixel density rather than anything else. So Apple’s screens are the only Retina displays on the market. And being Retina doesn’t mean much in terms of “bests”.

What this is, is the top of Apple’s line, and the cheapest reference monitor on the market.

And ultimately no. Even with the $200 add-on it’s cheaper than the alternatives. But as I said, it’s bad marketing. Every other screen in this market segment, and most other screens bar the very cheapest, are VESA compatible out of the box. Adapters for those that aren’t are routinely pretty cheap. By making that a separate item, they’re making every single person who buys one aware that a simple metal bracket from Apple costs more than the stand you’ll be using it with.

That isn’t going to endear anyone to Apple, and they’re making themselves look bad needlessly. If they included the damn thing and just charged $200 more, they’d still be undercutting the market but that cost would be hidden. And they wouldn’t be giving the impression of price gouging. Or they could have just made the monitor, and their stand VESA compatible like everyone else and still sold the monitor for $5k.


That … doesn’t mean anything to me. For starters, what’s “weird” about it?

What. No, it matters if you use Retina displays. Why would you even say this?

I’m actually more interested in this display than the MP19, because I can see myself actually making use of that for painting.

But they didn’t, and I’ll bet you that it just won’t matter in the long run, anyway.

Everyone will forget they were mad, the price will drop like it always does, and that will be it.

If they don’t like the price, they’ll do what I do—wait.


They killed off Final Cut Pro completely with version 7. They then introduced a totally new, iMovie-based program with a single edit window and called it Final Cut Pro, but it was not the same program at all. They killed Shake. They killed Aperture. Logic Pro survived for some reason.


It’s an Apple trademark. It does not exist outside of Apple’s product line and is largely not a thing outside their marketing. “Retina”, whatever it currently means, is not required for anything. It’s a nice looking screen, but there are other nice looking screens. Albeit few enough in that particular bracket.

Across the product stack it’s mostly characterised by non-standard resolutions not tied to outside needs, actually available media resolutions, or the resolutions media is created for. So you get things like 5k screens instead of 4k, when no one makes anything like 5k video that would actually require it.

Apple claims the standard is based on pixel density and their own judgements about when pixels become invisible at a given screen size and viewing distance. Which don’t match up too clearly to actual research and common standards for such things. And they’ve repeatedly downgraded what counts as “Retina” at a given size pretty arbitrarily. As well as adding resolutions well above their own claimed mark for the sake of having bigger resolution numbers than competitors. And for a while there the resolutions changed every time they updated any screen on any device.
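For what it’s worth, the conventional math here is the one-arcminute visual acuity rule: a pixel stops being individually resolvable once it subtends about one arcminute at the viewing distance. A minimal sketch of that rule (the one-arcminute threshold is the standard textbook figure, not Apple’s published method):

```python
import math

def threshold_ppi(viewing_distance_in: float, arcmin: float = 1.0) -> float:
    """PPI at which one pixel subtends `arcmin` arcminutes of visual angle."""
    pixel_in = viewing_distance_in * math.tan(math.radians(arcmin / 60))
    return 1 / pixel_in

for d in (10, 12, 18, 24):  # inches: phone, phone, laptop, desktop
    print(f"{d}in viewing distance -> ~{threshold_ppi(d):.0f} ppi")
# ~344, ~286, ~191, ~143 ppi respectively -- which is why "Retina"
# can mean wildly different densities on different devices.
```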

Leading to a pretty vast and ever-shifting number of one-off resolutions that software makers and web designers needed to deal with to make sure they displayed things properly. Though they’ve for the most part been standard aspect ratios, so less messy than it could have been.

Meaning Apple displays. Retina is just Apple’s brand of displays. Retina displays are not the best on the market. Dell has monitors that go toe to toe with them in the same space and at the same price points, but use standard resolutions. Which is important in some places (and less annoying everywhere). And if you’re willing to pay what you’d pay for a car, there are significantly better screens available, especially in the reference monitor market this is targeted at. Many screens with less visual quality are “better” for specific use cases. Like gaming, where high refresh rates and quick response times are as important as, and sometimes more so than, things like good color.

So I say that because that’s the case. What is significant here is not that this is a Retina screen. It’s that it’s a reference monitor, carefully calibrated for accuracy in color, contrast, and luminance. Which is usually not available anywhere near this cheap.

“If you use Retina displays” basically means “if you own an Apple device”. Don’t get me wrong, they’re very nice screens. And I might have one if I had the scratch and the current specs worked for basic gaming. If I had that kind of money I’d definitely consider this new one. Since I’m a sucker for things like good color and contrast and they just made the hardcore shit attainable. But the word “Retina” is not magic, it’s not a class of device. And a ton of other devices hit the mark for their resolution without said branding.

Apple was (still is?) selling the 2013 MacPro with the same guts at full MSRP when they announced the new products. Apple products do not come down in price, particularly at this end of the product line. Not until they get replaced.

Don’t be silly. This $500 plug-in will let you do the thing again!

More seriously they really did gut just about everything. To the extent that any of these still exist they have nowhere near the functionality or footprint they used to.


Make sure the mounting brackets are at the same distance.

Make it from an oak branch.

Call it ‘artisanal’.

Sell it for $1100.

I presume an option for wall mounting will come
I don’t think the design enclosure was at an all time record
marques brownlee says the professionals it’s aimed at
buy monitors without stands anyways…

and sarah paulson does not like holes smashed together
so she would just freak

What would you want XLR connectors for? On-board audio? Not much point in that IMO. Either go out digital to a DAC or get some serious (say: RME) external audio. Or are you talking about digital (AES/EBU, ADAT) via balanced line? From an onboard chip? XLR connectors sure look “pro” but that will be all the use you get from them.

The Apple BarclayCard has always been a sick joke fraught with consumer abuse. People get them for the promotional zero-interest plan, and don’t necessarily realize they can never get the promotional rate again for future purchases. It just becomes a regular credit card with an awful interest rate managed by an awful bank. The only reason any consumer asks for that product is to get 0% financing on a large Apple purchase, so the only value it has to the bank is for the number of customers they can declare and for how effectively they can screw those customers out of the promotional rate they were advertised.

The last time I used (or will ever use) any Barclays Apple financing, my first statement arrived with a due date in the past, meaning they had already canceled my promotional financing, and their helpful representatives insisted that wasn’t their problem. Fortunately I was using the financing as a convenience and was in a position (with some minor discomfort) to pay the entire balance and tell them where to shove their card.


Heh, I know what a Retina display is, I am using one right now. You’re apparently confused about how I am using the language here—though I’m not exactly clear on how I came off that way, I’m referring to that particular category of displays.

So it’s not really “weird,” and what you’re talking about seems to have no applicability in a discussion over whether anyone might want the XDR near as I can tell. I didn’t buy a Dell, so I don’t really care what Dell makes.

And who gets to decide what is “nonstandard” anyway? Retina at least started out as multiples of “standard” resolutions, so your use of the term doesn’t make much sense here. Why does it matter? Are camera resolutions considered increasingly nonstandard as time goes on?

I guess I’m really not clear on where you’re going with all this.

Objectively they are/have been higher-resolution-than-typical displays—indeed, that’s their distinguishing utility. If that doesn’t matter to you, that probably just means you’re not using it the way I do.

Yes, sometimes they also get replaced and they cost less. It’s almost like physics.

Just to be clear, while inflation is still in effect anyways, that’s not what I was invoking, which includes price drops on devices that were concurrently in production.

It’s not a category of displays though, not really, not the way “high res” or “HDR” or “high refresh” or “reference” are. It’s Apple’s product line. We don’t refer to Ford trucks as a product category. And the “Retina” idea is more a marketing term than a standard or set of features. It literally only means “we have more pixel density”, based on Apple’s preferred measure of the appropriate density rather than well known marks from independent research used by everyone else. They also kept changing what their preferred mark was, and then just kind of stopped mentioning it.

“Retina” does not account for or refer to a lot of the things that make Apple’s monitors good. Like color reproduction, brightness, contrast. And their quality there predates the Retina trademark.

You cannot get a Retina display from anyone else. If it’s got the same pixel density, it’s not a Retina display. If it has better pixel density, it doesn’t get to be a Retina display. When Apple decides that Retina means something else, they’ll go ahead and lower the resolution. When it behooves them, they’ll just go really big and really dense, even where the idea was to not be “wasteful” like that. And because of that, pretty much each size of screen they make has a different resolution unique to its size and unique to Apple. Though there are a couple these days using standard resolutions.

As a category Retina is just “Screens Apple makes and uses”. “User of Retina Screens” just means “User of Apple devices”.

Didn’t say it did. I made an aside and you went all “let’s defend Apple”.

What utility is it if the human eye cannot distinguish it from a very similar, but slightly lower (or slightly higher), resolution on the same size screen? What utility is it if nothing you will display, and no media available, will ever utilize or require that resolution?

From what I’ve been told here and by designer friends, visual arts folks like the added desktop space that comes from cramming a higher resolution into a given size of monitor. But there are other places/ways to get that.

What I meant is that Apple continues to sell items at full, original MSRP until they discontinue them. Particularly at the top of the line. Then the replacement has the same MSRP or higher.

Prices on Apple products generally do not go down, especially the “pro” stuff. And they strictly control sales at 3rd party retailers to limit or prevent discounts. The price on this or its mount and stand aren’t coming down. Not any time soon. A deal on them will come from buying used, or old stock if they ever get discontinued. A better price on stands and the mount will come from 3rd party options.

The only exception to this I’ve noticed is the phones and tablets. Where last year’s SKUs stay in production as the lower tiers of the product line. But just as often as they move down by dropping price, the new one goes to the top by costing significantly more than its predecessor did.

Industry associations, regulators, the full ecosystem of media, and what have you.

The resolutions Apple uses have no source in any outside standard or use case. There is no 6k media in need of displaying. There is no 6k resolution authority that set the standards. There is no 6k web standard. 6k is not used for projection. There are no cameras that shoot in 6k. 6k does not neatly map to a photography format or publishing needs. Video games are not looking to render at 6k. It is not tied to anything outside of Apple’s decisions. And the decision on the displays that are higher than 4k seems to have been dictated by “well, five is more than four” style feature creep more than any demand for those specific resolutions.

That’s what’s “weird” in this set of resolutions. They aren’t tied to anything. They don’t seem to scale from one another. And Apple has repeatedly, arbitrarily lowered a lot of them. It’s a messy set of screen resolutions that don’t make a ton of sense, and are completely different from what anyone else is working with. For much of the life of the mark they haven’t even been consistent generation to generation, or consistently trended up.
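For concreteness, a few shipped “Retina” resolutions, from memory of public spec sheets (worth double-checking before relying on them). The pattern both sides here are gesturing at: mostly 2× doublings of earlier Apple panel sizes rather than broadcast/web standards, plus outliers like the XDR:

```python
# Selected "Retina" panels and where their numbers appear to come from.
# Figures recalled from public spec sheets -- verify before relying on them.
panels = {
    "iPhone 4 (2010)":        (960, 640),    # 2x the 480x320 of earlier iPhones
    "MacBook Pro 15 (2012)":  (2880, 1800),  # 2x 1440x900
    "iMac 5K (2014)":         (5120, 2880),  # 2x 2560x1440
    "Pro Display XDR (2019)": (6016, 3384),  # no 2x ancestor; ~218 ppi at 32in
}
for name, (w, h) in panels.items():
    print(f"{name}: {w}x{h} ({w/h:.2f}:1 aspect)")
```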

HIDPI plus lots of screen space. (for an IDE, for instance) What’s not to love?
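For anyone unfamiliar, that combination works through scaled HiDPI modes. A simplified sketch, assuming macOS’s usual render-at-2×-then-downsample behavior:

```python
# macOS-style HiDPI: pick a "looks like" desktop size, render at 2x that,
# then scale to the physical panel (1:1 when the 2x render matches it).
# Bigger "looks like" = more workspace, slightly softer pixels.
panel = (5120, 2880)  # e.g. a 5K panel
for looks_like in [(2560, 1440), (2880, 1620), (3200, 1800)]:
    render = (looks_like[0] * 2, looks_like[1] * 2)
    print(f"desktop {looks_like[0]}x{looks_like[1]} -> "
          f"rendered {render[0]}x{render[1]} -> shown on {panel[0]}x{panel[1]}")
```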

I’m not saying it’s bad. Just that there’s not a lot you’re getting out of it besides that. And there are more sensible ways to go about it. Like you could just go with the next highest standard resolution on the same size screen and get more dpi and space, instead of inventing an in-between.

And the more recent stand-alone monitors seem to have switched to a “start from 4k, increase by 1k each step” approach, which makes a lot more sense. Discarding the “no more dpi than I say a human can perceive” standard. But the rest of the product line is still all over. Apple Watches have Retina displays, and each Apple Watch, and each version of those Apple Watches, has a slightly different resolution that only that one device has ever used. And apparently no device will use again.

Some of the laptops and tablets have had screens that are just a few hundred pixels off 1440/2k. So less dpi than the standards would give you.

ETA: and to clarify, if that were the pitch and the goal I wouldn’t be poo-pooing it. To a certain extent Apple’s stand-alone monitors, and the screens on some of the computers, are pretty specifically specialized for visual art, design, and photo. Where those two features are prized.

Bit of what I was getting at with “not the best by every standard”.

“We go higher than standard resolutions, extrapolated linearly from those standards, because these are for design and that is good for design” is an entirely sensible thing. But that doesn’t explain what the hell is going on with the rest of their screens. And when it was purportedly a specific standard, the pitch on Retina was basically “high dpi, but not too high dpi”. And now it’s just branding.