$999 monitor stand explained

You don’t suppose Apple might be taking a little something, do you? Perhaps for use of the brand name? You know, licensing and whatnot? Trump style.

Did I say they weren’t? I am confused as to the point of your question/reply; could you clarify?

Apple and nVidia have been on the outs for a while. Apparently nVidia screwed the pooch on something back when Apple dealt with them. I don’t follow Apple super closely so I don’t recall what it was, just that it was years ago, and Apple will not use them as a supplier since.

I don’t think lack of support for them is a result of that though. Apparently until recently MacOS didn’t even support AMD’s normal consumer and pro lines. You had to buy specific for-Mac cards, and the ones they included in the product line were different from the retail product line.

The new Mac Pro announcement was actually pretty interesting on that front, for the lack of attention it’s getting. The base model includes the workstation version of one of AMD’s normal consumer cards. And the higher options revealed two previously unannounced cards based on a previously unannounced chip: a new version of the 7nm Vega chip (the Vega II), and a dual-GPU card based on it. Straight up “this computer does expensive math” stuff, but a fairly serious reveal from AMD.

But it’s totally organic, all natural, and carbon-neutral!

That was the Nvidia GeForce 8600M debacle, where the GPUs were all manufactured to the wrong heat tolerances or something something something and they started to come apart over time. I know cuz I had one of those machines and got a free mobo replacement. Imagine if you shipped a hundred million dollars’ worth of goods, then it turned out you screwed up and all $100 million had to be trashed. It nearly killed Nvidia and is the reason why neither company will do business with the other. But removing drivers of any kind for NV is basically saying you won’t let half the GPU market into your fancy new iron. No CUDA? People gonna freak. the. f!ck. out.

2 Likes

Recycled jokes are the best jokes…

1 Like

The fact that this is 100% optional seldom gets mentioned in these sorts of discussions, but I think that’s key.

If you want/need one, then this stand is for you. If not, then… not. Not hard.

It does seem to harken back (to my mind) to the Radius Pivot, but not everyone remembers their utility (or their cost, which was not insignificant).

Some do. Not all of my systems are, however. It’s nice to have the flexibility, which the MP19 appears to be set to offer.

I’d be more interested in the new Pro if it weren’t—at its lowest config—absolute overkill for most of what I do (though I wonder how it might handle a 150 MB Illustrator CC file which brought my 2015 MBP to its knees earlier this year—but I seldom get those sorts of jobs nowadays).

The MP19 looks aggressively priced compared to HP’s Z series machines configured for the same tasks, and that display easily encroaches on the capabilities of $40K color reference displays at a much lower cost, but the concept of “pro” seems pretty fluid, and Apple has staked their claim on the 8K video side nowadays.

It’s also worth noting that I probably haven’t needed any “pro” offering in terms of capability on those grounds since the early 2000s. I don’t see that changing, but if it did, I’d probably be in line.

The thing I have seen from people who actually buy such things (which is not me at all) is that it’s in the normal range for such things. But Apple doesn’t offer contract/enterprise pricing or proper enterprise-level tech support. Their tech support option is the exact same AppleCare package offered on consumer units, rather than the on-site, replacement-within-the-hour support competitors offer on contract.

So I’ve seen a number of people claim that they would not be allowed to purchase these machines even if they were interested. And they would be paying significantly more for them than other options, even as the MSRPs are fairly close. If that’s true (and I wouldn’t really know) then it definitely undermines the “for the pros” idea, even if that’s what Apple is targeting. Because two of the most important things to that market just aren’t available with these.

I would expect those things to be announced at some point, cause it doesn’t really seem targeted at that market otherwise. You can’t really just say “it’s expensive, so pros”; you should probably check with the pros first. And I don’t think I’ve seen any coverage where someone did that. It doesn’t take all that much effort to contact an IT guy at a company like that and ask “what do you guys actually pay for a computer like this, would you consider this specific computer, what does a ‘professional’ stand look like?”

And that is something pros care a bunch about. Apparently less so these days than they used to. And apparently that support was dropped 6 months ago. That was the other quiet AMD announcement from Apple: an updated and expanded version of their alternative, which has apparently been slowly gaining in popularity but is being held back by lack of resources and attention on AMD’s side. So there’s that.

Yes and no. From what I’ve seen in the coverage, the monitor does not come with any form of stand. So you either buy the $1k stand or pony up $200 for a VESA adapter, because Apple has never met anything it can’t do proprietary. The adapter to use something else costs as much or more than that something else. So what’s not optional is paying much more to Apple than makes sense, even if you decide to opt out of the stand.

I think the more important point is that the monitor itself is really aggressively priced. The nearest thing to it on the market is apparently an eight-thousand-dollar Dell that doesn’t technically qualify as a reference monitor, with true reference monitors being insanely expensive five-digit specialty products. And if you genuinely need that sort of thing, like say color correcting for 8K cinema projection, you probably aren’t going to switch to Apple’s 6K (for reasons) not-quite-the-specs-you-need model just cause it’s massively cheaper. But it just became plausible to stick reference monitors on a whole bunch of shit where it would be useful but was previously not practical.

They’re gonna sell a lot of these screens, and for a long time Apple’s screens have been their last significant toehold in the professional creative market. But people aren’t gonna like paying $200 to not use Apple’s stand. It’s just really bad marketing. Charge $200 more and include the adapter, or just make it VESA compatible out of the box, and no one would care.

1 Like

It used to be that I needed to upgrade so my computer could process new media faster. My early computers had no need to play back internet video, for instance… then video became higher res. But in like the past 7 years, no new media has come out that my computer can’t handle. And considering we are pretty much at movie theater resolution now, and no one can really perceive much more than that on a screen, I can’t imagine why I would need a faster machine. I mean, I’m not a gamer, but I suppose if I was, it’s getting hard to imagine even video game graphics getting any more processor-heavy at this point. They look about as realistic as movies already. I’m saying, it’s getting to the point where it’s hard to imagine why you’d need more… my iPhone can hold every mp3 (and digitized CD) I’ve ever purchased. I guess they keep making the cameras better, but it’s pretty minor… I can’t see the difference. Most consumers just watch YouTube videos and check Facebook on a digital device, and for most people devices made 6 years ago already do that perfectly.

1 Like

But Apple doesn’t offer contract/enterprise pricing or proper enterprise level tech support.

I could see that being a problem for some places, maybe… But the original Mac Pro had no such care scheme and the shops ate them up. Like you say, these places often buy or lease such gear from third parties who provide on-site maintenance. It worked in the past, so I can’t see why it wouldn’t this time. Plus, Apple has a slightly better maintenance reputation than, say, HP. But frankly, that might not be true any more.

In my experience, shops cater to the talent in most cases and buy what they want to use, even if it comes at a premium. In the past, creatives liked to work most on Mac OS and there was an ecosystem of pro apps that catered to them. Now that’s changed to a degree, so I could see a lack of demand for these workstations. No nVidia support and several major apps that have killed their Mac OS ports… that’s a bigger issue to my mind.

This has never been Apple’s strong suit, else we wouldn’t have been waiting for this stupid machine for the last six years!

ETA - You’re totally right about a lack of enterprise pricing… that’s just crazy.

1 Like

Bingo!

Here’s how it plays out:

  • MarCom submits requisition for Mac Pro, including display and $999 stand.
  • Accounting rejects requisition - no enterprise pricing! Apple isn’t an approved vendor!
  • VP of Marketing calls (or emails) VP Finance or CFO, reminding them of the last great product launch, and how they both have bonuses tied to the successful launch of upcoming Widget X.
  • VP Finance/CFO send email to Accounting, “order the damn Mac Pro!”

I agree, but these are for a specific client.

I’ve been cutting TV for 25 years and building studios since 2012. I’ve had a lot of clients that would laugh at the thought of spending $1k on a stand. That’s silly for some. But for others, when you consider that this thing is supposed to replace a broadcast reference monitor (effectively replacing a dual-monitor setup), the price is right at about half the cost of a decent Sony, and they wouldn’t think twice if it saves them over buying a monitor and a reference display. Separating the two was silly, but so is their price. They should have just made the monitor $1k more, or I dunno, not been stupid greedy? Seems like a way to “pad the sale”.

If you only want one monitor on your desk and no IO “dongle” to your broadcast monitor, this thing is pretty sweet and a deal. That said, I’ve never worked on a single-monitor setup; certainly not online editing. Maybe in the offline edit, but even there I find it to be a bad habit having only one display to cut on. In the days of “lazy editing” we seem to have forgotten (along with many other things) what a reference monitor is for, which is more than just watching the show down at the end of the edit. A large reference is so that your offline editor doesn’t choose stupid shots with the camera man in it, or someone wearing a stupid t-shirt, because they cut the damned show in a 6” portion of their $6k monitor. You’re supposed to keep your eyes on the monitor, not your computer display or the timeline. Avid is set up that way, but as applications like FCP and AP have taken off, bad habits like this enter the workflow, and online editors like myself have to cringe when fixing bad edits that are clearly done by amateurs that don’t consider themselves amateurs at all. Same thing goes for a decent pair of audio reference monitors. Don’t cut with your laptop speakers or your earbuds.

I would never recommend cutting without a decent reference monitor. Does every editor need “the best monitor that ever was”? No. Sometimes bigger is good enough. A colorist or finishing editor would like this, but even they would use a 2nd display so in a two monitor setup the value is a little dubious. At least for both displays. This monitor is good at replacing a $15k reference monitor, but as a primary display? It’s a bit overkill especially if you’re building 10-20 suites. And I would still package reference monitors to go along with an iMac “offline” machine, but I would probably recommend a $2-3k monitor in that instance.

The new Mac Pros are a little expensive compared to what they last were in 2012, but they’re actually pretty much on point for what they are. I’d recommend the Apple display to a client in a package with a tower: a decent Technicolor-certified monitor for primary and the Apple reference monitor for the secondary, IF they needed to finish or color a pic. Otherwise another $2,500 monitor is probably good enough.

3 Likes

Probably, but we won’t be the fools who buy them.

For $1000 I’d expect something like my Ergotron monitor arm, but better looking.

A couple of major exceptions:

  • SSDs provide instant access to data.
  • Every so often, WiFi and cellular networks improve.
  • Screens have gotten a lot better since then.
  • 4K video — and the DRM required by certain streaming providers for the 4K option.

Most of these are readily available at the low end of the market-- if you know to ask. They weren’t six years ago.

1 Like

When I saw it was a 6K monitor…I thought that was the pixels not the price.

1 Like

Well, like I said, a huge part of that is there hasn’t been all that much advancement in terms of processors over that specific span of time. About 8 years back AMD shit the bed with their processor line, producing something that was just not at all competitive with what Intel was selling. Lacking the funds to fix it, they basically let it hang out there for 6 years and plowed everything they had into starting over. In the meantime Intel didn’t do a whole hell of a lot to innovate or push performance higher. And owing to some snags of their own, they’ve basically been pushing slight tweaks of the same processors for the last 4-5 years.

Basically, over the very period you’re talking about there wasn’t really anything in the way of meaningful increases in CPU capabilities. Just things that could do the same stuff, the same way, but ever so slightly more quickly (and increasingly at higher prices). Even with AMD’s new stuff out, they’re pretty much at parity with Intel for the moment on actual speeds. But AMD is pushing additional cores and multi-threading up and down their product line, making 8 cores rather than 4 the baseline in their consumer chips. So it’s not that there’s nothing else consumer software could use resource-wise; rather, software isn’t built to take advantage of more, because more didn’t exist at the consumer level.

That’s what I was getting at above: software will begin to take advantage of it, and then require it. The very same thing happened when 64-bit and multi-core CPUs first launched. At first it was nice to have, and things here and there would use it to get better performance. By now you probably couldn’t run even your web browser on a single-core processor. The software grew to take advantage of what was available, and no further. And at a certain point that meant older stuff just wasn’t enough.

Intel spent the last decade or so pushing 2- and 4-core processors, with no multi-threading, as the consumer default. So most consumer software can’t take advantage of more than 4 cores, or use multi-threading. That’s started to change in the last few years, and as a result a lot of 2-core processors are starting to get pretty reedy. And with Intel now pushing multi-threading into processors besides their top-of-the-line ones, and adding cores to “mainstream” chips, that’s going to continue.

It’s entirely possible that the same bit of software will no longer “do that perfectly” on that 6-year-old chip in just a few years, because that software will be looking for bits that are not there.
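To make the core-count point concrete, here’s a minimal Python sketch (my own illustration, not anything from Apple/Intel/AMD) of how software opts into however many cores the machine actually has. `busy_sum` and `parallel_sums` are hypothetical names; the idea is that the same code automatically does more on an 8-core chip than on a 2-core one, and conversely runs starved on old hardware once workloads assume a bigger pool.

```python
import os
from concurrent.futures import ProcessPoolExecutor

def busy_sum(n):
    """CPU-bound stand-in for real work: sum of squares below n."""
    return sum(i * i for i in range(n))

def parallel_sums(chunks):
    """Run one chunk of work per process, sized to the machine.

    os.cpu_count() reports logical cores (so hyper-threading counts),
    which is why the same binary scales up on newer consumer chips.
    """
    workers = os.cpu_count() or 1  # fall back to 1 if undetectable
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(busy_sum, chunks))

if __name__ == "__main__":
    # Each chunk can land on its own core; a 2-core machine serializes
    # most of them, an 8-core machine runs them nearly all at once.
    print(parallel_sums([10, 100, 1000]))
```

On a Mac Pro-class part with dozens of cores the pool just gets wider with no code changes, which is the “software grows to take advantage of what’s available” dynamic described above.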

If memory serves in the pre-Intel Mac days they did quite a bit of that, especially in academic and educational markets. Or they at least did more on that front than competitors did, I was repeatedly told so as a Film student and while working at a research hospital. And those were the days where Apple wouldn’t just sell you computers in a block, and help install them. But build out the whole room for you with special Apple branded sliding glass doors, and glass walls (if you paid enough).

We’ll put it this way. The last time I saw a 100% Mac video/film production facility was with the Power Mac G5 tower; the Mac Pro was basically the Intel refresh of that unit. By the late ’90s and early ’00s, most of the Mac-exclusive professional software packages out there had gone cross-platform. The shops that were eating up the Mac Pro were the remaining shops that were already locked in on Apple. And it was that generation of the device where that presence would erode. The previous generations were where that lead was built. And at that point in time (G5 cheese grater days) there weren’t a whole lot of competing options in terms of pre-built, off-the-shelf workstations of that sort. Especially when it came to the whole multi-processor, plug-them-together-and-run-’em-as-a-cluster thing.

It was a very, very, very different market than where these things sit now. And if direct competitors are offering a dedicated, on call, tech monkey to come un-fuck your shit at 2am if need be. Apple’s gonna need some night shift primates.

I mean the press. A lot of tech sites, and Rob here, are saying that the worries people have about this revision, or the points where the prices are obviously fucked (the stand and VESA mount), are not real because “it’s for the pros”.

But none of these venues seem to know what pros actually use or pay for these things, nor do they seem to have made an effort to find out. And if it were not Apple we were talking about it wouldn’t be getting dismissed with a “for pros” hand wave.

All I’m saying is ask someone who knows. If you’re trying to make sense of this release from that perspective, it doesn’t really work without the context.

For my part I’m having difficulties reconciling the points of this that are obviously targeted at consumers/prosumers and the freelancers or independents that have been calling for another go round on the original MacPro. And the pricing and other bits that are clearly targeted at another tier up and large scale corporate buyers, without the apparent support systems that are now standard there.

Whole thing is a bit messy. It looks like they tried to split the baby a bit.

I’m assuming that’s what they’re thinking.

But I’ve never seen a broadcast reference monitor mounted on anything but pretty standard VESA mounts at a fraction of the cost. And again, the way ordering for this stuff goes is not “well, it’s still cheaper than the regular monitor, so I’ll just add this separate line item at $1000. It’s fine”. If you’re pricing this shit out, you’re gonna go for the VESA adapter and apply that cash to a different item (perhaps more monitors with VESA adapters instead of $1000 add-ons). And the $200 upcharge to use the damn thing is still gonna grate, because almost any other screen you buy is just compatible out of the box.

Like I said, it’s a marketing fail. The total package still costs less than the nearest product from someone else, and vastly less than typical reference monitors. Put all of it in a box together, fancy stand and adapter, charge $6200 for the damn thing, and people would marvel at the value. Break it out into a series of upcharges and it starts to look sketchy. They can sell this screen for $5k. Which means they could sell that screen for $5k VESA compatible out of the box, by adding a couple extra screw holes to the casing. Basically they let people see the gouging (or the Luxury Brand marketing, if you wanna be nice). Which is dumb.

Still gonna sell a fuck ton of these.

I don’t think it’s meant to; at the very least the very pricey one is gonna beat it on certain specs or size or some metric. And if you really need that sort of thing, you may very well need to pony up for that. I think it’s meant to proliferate them. Reference monitors have their use in video as non-primary displays, for exactly what it says on the box: reference. But now you can have them places where they would be useful but not absolutely necessary. Like all your displays.

And this sort of thing is used heavily in graphic design, professional photo, and publishing work. At any TV production company I worked at, most of the workstations did not have a reference monitor; any place doing web or corporate/institutional work, likewise. The big, full Avid stations had them (if the company had those). Everything else just had good, but not reference-good, monitors (frequently Apple or Dell). Now a lot more of those workstations can have them. A lot of those graphic design and non-video visual arts folks can justify a proper reference monitor instead of getting one that’s generally close enough, or as good as you can afford.

Basically, expand the capabilities of a tier down from that big ole Avid station.


Apple’s got this weird dedication to alternative, non-standard resolutions. Almost everything is just slightly off the baseline: 3K, 5K, 6K. Down to “bigger is better” marketing and the weird way they keep shifting their Retina definition and focus on pixel density. It’s weird cause literally no media is produced at those resolutions, and the only reason anything (apps, webpages, etc.) is built to work well with them is to make sure shit shows up right on Apple devices. And chances are you aren’t really gonna be able to see much difference quality-wise.

But I understand your graphic design and visual arts people like the extra working room they get from the added pixels.

2 Likes

I’d say the iPhone profit stream killed off any dedication Apple had to big iron. They bought up killer apps for video editing, compositing, professional photography, music creation to get people to buy big macs, then killed them. You just can’t trust Apple to work with you…on anything.

1 Like

And with more cores, cache, etc. If anything it seems like clock speeds have been decreasing.

Thanks again to Oglaf, I have a reference for fucking tree branches. The garden is not corroborating except to have red maples blow away the zinnias for growth hacking and whatnot; every day it’s another 8x8 account for the maples and their branches and their Jaguar SUVs with scheduled maintenance included.

Uh, and we apparently, instead of childcare or vision that pays optometrists and eye surgeons, need a solarium full of iPadOS for deciduous or something. That was since some of them discovered ‘professional’ meant the things at Lowe’s and Dell were half the given price but needed to have an annual reported markup somehow. This meshes well with the review for a nice 23 or 27" 4K LG hazwire monitor (hazwire -passthrough- and 3 USB-C rather) “like other $300 monitors” that Apple offers for $669. Fully expect the professional HomePod to talk out the side of its mouth about new options in finance, the Schumpeter might have all leaked out of the…Apple App Store?

@RyuThrowsStuff so that’s the setting! No more (uncut) paintings on the back of the monitor, you need the -__wire2 and Displayport things to be daisy-chaining and letting down 96W as they do. There is an implicit understanding that you need to pay a fine artist an other half of the costs to make and apertureise their art.

Went to the last vendor I saw for an Opteron server, they were selling $400 remade Lenovo 440s. But with Win10 instead of OpenBSD, and no LibreBoot. So I have to believe the AMD Epyc systems were in another retail venue, which one has to know how Cydia works to access.

1 Like