Read through it a bit more. Damn, it's ghetto that it doesn't expose the correct resolution by default, and that text is ballsed up in non-"retina" aware applications. Christ I hate the term "retina" to describe displays, it drives me up the wall.
The article clearly explains why the resolution isn't exposed to the user. It seems to be the natural choice. Also, who cares that it's called Retina? It's just a name, and the marketing is correct assuming you're looking at it at the right distance =P.
Since the user interacts with these three devices at different distances, this makes sense.
The idea is that, assuming the device is a normal distance from you, and you have normal eyesight, the resolution is sufficiently high that you won't distinguish pixels.
At the normal viewing distance for a laptop, it doesn't clear that hurdle. They've thus changed "retina" from a reasonably truthful marketing gimmick into misleading/false advertising.
I made a table showing the minimum number of pixel rows required to match the iPhone 4's "Retina" effect (at 10") from a given viewing distance.
As you can read from the table, a 16:10 15.4" screen needs 1922 rows at 14" and 1682 rows at 16" away. Since the "Retina" MBP has 1800 rows (from 2880x1800), I say that it's "Retina" from 15" away.
You can also find that a cheap 22" 1080p monitor is "Retina" if you're about 33" away. But the moment you get closer than that, your 20/20 vision can discern individual pixels.
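If you want to redo the arithmetic yourself, it's just trigonometry: match the angular size of an iPhone 4 pixel (326 ppi at 10") at your own viewing distance. A rough sketch in Python, using my own assumptions, so treat the exact figures loosely:

```python
import math

# Angular size of one iPhone 4 pixel viewed at 10 inches (326 ppi).
# This is the "Retina" threshold the table is based on.
RETINA_ANGLE = math.atan((1 / 326) / 10)

def min_rows(diagonal_in, aspect=(16, 10), distance_in=20):
    """Minimum pixel rows for a screen to look 'Retina' at a given distance."""
    w, h = aspect
    height_in = diagonal_in * h / math.hypot(w, h)   # physical screen height
    pixel_in = distance_in * math.tan(RETINA_ANGLE)  # largest "invisible" pixel
    return math.ceil(height_in / pixel_in)

# A 15.4" 16:10 panel viewed from 14" and 16" away:
for d in (14, 16):
    print(d, min_rows(15.4, distance_in=d))
# Prints roughly 1900 and 1660 rows -- close to the 1922/1682 figures above;
# the small difference comes down to the exact acuity assumption used.
```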
It's a fun table to play with. If you care about the technical junk, Apple had a slide about it in the iPad 3 reveal:
This whole thing seems a bit wasteful here, to be honest. For example, I decided to measure how far I sit from my 15.4" MBP during normal usage and came out with ~26 inches. So according to your chart, my 1680x1050 screen is ALREADY a "retina" display! And certainly I can't discern individual pixels on it under normal circumstances. Heck, I sit about 30 inches from my 27" 2560x1440 display so it comes pretty close.
I'm frankly much more interested in the IPS aspects, especially viewing angles. The MBP already had one of the best panels available in a notebook from color and contrast perspectives, so further improvements there will just be icing.
I'm also interested in how they plan to keep a GT 650M and 45W IVB quad-core cool in a 0.76" chassis, when the current SNB models are 1" thick, sound like 747s under load, and still run up to nearly 90C.
Apple screens are Retina (tm) (Uppercase "R"). Any screen, including Apple's, can be "retina" (lowercase). The term existed prior to Apple's co-opting of it.
Where does the article "clearly explain" why native resolution isn't exposed to the user? The writer only hints at the fact that Portal 2's text dialog window is very difficult to read at native resolution. Quit overstating.
Two months after release and we will have apps to completely expose the correct resolution to the system. I'm sure there will even be hacks that let you banish the word "retina" from system dialogs.
I'm with you, piroroadkill -- the "retina" term to describe high resolution Apple displays drives me up a wall, too. So does the use of non-standard SSDs and other non-standard parts.
> How are they hurting you?
> Deal with it, buy another company's product.
Well fuck, he (and I) probably will. Just because there are other options out there doesn't mean that we can't voice our opinions on what Apple releases.
They could easily fit the 1.8" to 2.5" difference; teardowns show that, if you know anything about EE. Deal with it? Probably not. Typical fanboy response instead of engaging with the critical analysis.
Some people like not getting wallet-raped for things they could have the option of repairing themselves. Aside from that, who the hell is going to benefit from a 2x-bumped resolution in a form factor whose diagonal is at most 15"? People who read 3-4 tiny PDF pages? No. Maybe 3D artists in wireframe mode? Probably. Photographers? Lol, no -- you can't edit on such a small, cramped screen; you might as well just use the iPad 3, which is great for PDF reading and a much better form factor. Overall the mid-2012 MBP is a logical fail, and wallet rape for useless features. The lightness is probably a big win, but soldered-in memory? Why not just go iPad? This is an Asus Transformer with a keyboard attachment, paired with a screen that is too small for normal functionality...
with a tiny LIF 1.8" SSD that isn't even standardized. That would ALSO help techies and Apple themselves. How would Apple benefit? Production with standard parts would cut 15% off the price tag, and repairs would be a snap.
People trash on Apple for not innovating because they use standard off-the-shelf parts, and people trash on them for custom-designed parts; you just can't win with some people.
But more to your point that there is no reason for this... look at what it brought the last time Apple went crazy non-standard with the Air. Apple created a new class of laptop that only now, 5 years later, are vendors who use "standard" parts catching up to.
The 2880x1800 display isn't standard either, but I haven't read where you are up in arms about that. Hell, their SSD card is more standard in that it uses NAND, a Samsung controller, and an mSATA connector. I see no reason why anyone with the desire and know-how can't make their own SSD cards to go into this machine.
That's true for displaying your desktop, but rendering a 3D scene in modern games with loads of textures at that resolution just won't cut it. Anyway, it's not a gaming rig, so that isn't expected of this product.
Lol. You would NEVER fold on a mobile processor, what the *** is the point of that? CUDA or OpenCL on something more restricted than a 560? Come on, get with the program. A small desktop with a 680 or even a 690, or 480s on sale in SLI, and a quick Z77 board paired with a 6-12 core, 2x-threaded chip -- you wouldn't use this eye garbage to do any serious work. It would pop at 90C and its diode sensor would fail easily under intensive work. Troll.
Programs like Final Cut X and Aperture heavily rely on the GPU for video decoding and rendering, and generating realtime image effects and animations using OpenCL.
In particular, the system requirements for Final Cut X specify OpenCL-capable graphics. People managed to hack Final Cut X onto old Macs without OpenCL graphics cards, which caused a fallback to CPU rendering. Instead of being able to layer on multiple effects while playing back an HD video in realtime, a single effect brought the program to a crawl.
I think it's safe to say that you wouldn't be doing much modern 3D gaming at 2880x1800 in this thing regardless of whether or not you have 2GB of vram. The extra resolution is there for displaying your desktop, while for gaming you've got pixel doubling at 1440x900.
Most games would not run at native Retina resolution on the 650M regardless of the video RAM; I don't think 1GB will be the bottleneck. You'd be playing anything more demanding than Diablo at a lower resolution.
I've read a few reviews of Diablo III where they used lower-end GPUs and said that, due to the style of the game, frame rates in the 20s were acceptable for gameplay. Obviously if it were an FPS, then I would definitely say Anand is on drugs, but because I've seen it said in other places, I let it slide.
The MBP isn't built for heavy-duty gaming anyway; it's meant mostly for professional work.
"MBP isn't built for heavy duty gaming anyways, meant for professional work mostly." Then why does it use a consumer-grade GPU? Certainly were that the case, a Quadro or FireGL would make more sense, like ThinkPads and Precisions use?
Form factor, for one. The machines with those cards are enormous. Also, the benefit of those cards is only noticed in very specific applications like CAD and serious motion-graphics. I would think this audience constitutes a very small percentage of those interested in purchasing this laptop.
It depends on the type of game and then the game itself. Diablo isn't a first person shooter, it plays perfectly fine in most scenes at 20fps. There just isn't much happening on the screen that needs to be drawn more than 20 times per second.
Anand isn't the first person to cite 20 frames as playable on Diablo III.
It's 20fps in the heaviest scenes in Diablo III. Meaning most other scenes run higher, and the actual average fps you'll experience across the whole of Diablo III is more than 20 fps.
No, it's in the 20s early in the game. Anybody who has actually played the game through knows the battles become bigger, with many more enemies and effects going on. The first Act does not give a proper indication of what's acceptable. Anand shouldn't be saying it's "ok" for something like D III when in the long run it won't be.
It's not a twitchy shooter, I think casual gamers would be fine with 20 (so long as the dips weren't too big or frequent). Anyways you could run it at a lower res and get a much better frame rate.
You've gotta be kidding me. OBVIOUSLY Anand is not saying that you should buy a Retina MBP so you can play Diablo at 2880x1800. It's a curiosity, an academic discussion. "Hehe, Diablo loads at 2880x1800, and I can run around town! Cool!"
Would you have preferred if Anand was like "I tried playing Diablo III at 2880x1800 on a 0.7" thick laptop, but it's nowhere near playable in intense battle scenes in the later levels! I am extremely disappointed in Apple! This is an absolute shame! This computer is in no way a replacement for an equivalently priced Crossfire or SLI equipped desktop running dual 1920x1200 monitors. Therefore I say Apple has ultimately failed in their quest to bring premier ultra high resolution gaming to the thin and portable notebook format."
The game becomes more taxing on graphics as you advance through acts and difficulty levels; there's anecdotal evidence of this noted in the Update section of http://www.anandtech.com/Show/Index/5865?cPage=2&a...
Right... I'm sure you know more than Apple engineers who designed this thing. If they found that 1GB is plenty for great performance, what makes you think you know better?
It probably comes down to the same thing it is every time. Apple makes the software and it runs a ton better than what you'd expect just looking at the numbers.
For those who use a laptop as their primary computing device and dock it when at a desk, you have a dilemma with this laptop. With the integrated display having a higher resolution than any external display, it seems counter intuitive that when you're at a desk you would connect the laptop to a dock and use a lower resolution external display. Are the currently available external displays still better than the integrated high-resolution display in this laptop?
Well, the MacBook Pro scales everything, because otherwise 2880x1800 on a 15" screen would make things WAY too small; you could never read the text here, for instance, without getting extremely close to the display. It's not at all practical. So everything on the MacBook Pro is scaled larger (giving you less work space), and you'll still get more stuff shown to you on a 30" 2560x1600 desktop display, for instance - you'll have more work area.
And obviously high-end desktop displays will have things like much higher colour gamut, more accurate colours, higher brightness and so on.
Even if there were, it is beside the point. At most you effectively have a 1920x1200 "res" with the laptop, in a desktop environment. So you have every good reason to connect it to an external display, as before.
Or placing the laptop on a platform 10 inches away from your nose and the keyboard underneath the platform. At least this way you could save on the electricity bill.
I'm pretty sure that IGZO doesn't have anything to do with the pixel matrix. IGZO refers to the transistors that drive the panel, not the crystals that form the picture. Right now IGZO was shown with *VA and OLED technology, but I don't see any reason why TN and IPS shouldn't also be used in the future. If someone knows more about this, please correct me. :D
> At 1440 x 900 you don't get any increase in desktop resolution compared > to a standard 15-inch MacBook Pro, but everything is ridiculously crisp.
This word, resolution, I do not think it means what you think it means.
You ARE getting an increase in resolution, that's why it looks crisp. You are NOT getting any increase in desktop SIZE. Your glyphs are the same size (in mm or whatever) but are being RESOLVED at a greater PPI. They are simply faking out applications with an artificial screen size. . .
I know. I hate that as well, but it's become industry convention to use "resolution" to describe pixel count AND the size of UI elements. I feel your pain.
I think what he meant was that desktop size will still be the same, because everything is scaled up when you're at a higher res in OS X.
For example, I'm on my HP now and I can fit about 20 icons horizontally. If my res magically became 1080p, the icons would look smaller and thus more icons could fit.
Desktop size won't change; things will only look more crisp.
I wonder what definition of resolution you are applying. Here is the definition of image resolution in Wikipedia:
- Image resolution is an umbrella term that describes the detail an image holds. The term applies to raster digital images, film images, and other types of images. Higher resolution means more image detail. http://en.wikipedia.org/wiki/Image_resolution
According to this definition Anand is right. You don't get any more detail by replacing one pixel with four that carry exactly the same information. For instance, a black screen has null resolution no matter how many pixels it has or how close together they are, because you can't resolve any line pair in such a screen.
A similar criterion applies to:
- Optical resolution, the capability of an optical system to distinguish, find, or record details
- Resolution (mass spectrometry), the ability to distinguish peaks in a mass spectrum
- Angular resolution, the capability of an optical or other sensor to discern small objects
- Spectral resolution, the capability of an optical system to distinguish different frequencies
- Sensor resolution, the smallest change a sensor can detect in the quantity that it is measuring
http://en.wikipedia.org/wiki/Resolution
Rendering off screen and downscaling is a clever workaround to make the new display more versatile, and the fact that they went IPS is just lovely. Why did they not update any of the Airs, though? The price drop is nice, but a 1080p (or x1200!) IPS with a $100 price hike would've been just as swell.
How much work is involved in updating 3rd party apps for the new display?
Clever it might be but it's not resolution independence. They are just workarounds. And Apple cannot expect the entire world to code 2 versions of apps/sites just to appease their niche high-res laptop.
...almost like Portal 2 is rendering a 2880x1800 image to a 1440x900 surface presented by the HiDPI support, and then HiDPI is scaling back up to 2880x1800.
That would be about the only way to explain the pixelation and the unreadable console.
Indeed. The Portal screengrab showing the console is at 3840 x 2400 resolution, while the diablo ones are at 2880x1800. There must be some upscaling going on...
I think he said he was playing at 1920x1200, so that kind of makes sense. That's a little too much mangling to be anything other than a total bug, though, it'd have to be rendered at something like 720x480 to be that bad.
Seems to me that this machine would have been much better off with a 1920x1200 IPS panel that it was actually capable of pushing as opposed to a 2880x1800 panel that the included 650m can barely handle...
That and not having the RAM soldered to the circuit board might have made this a decent upgrade for those of us with older MBP's...
As is you get a screen that looks really nice but the computer is too underpowered to do much with...
I don't see how the average user will benefit from increased DPI. I can tell how they will benefit from viewing angles, increased contrast and color accuracy, and more screen real estate, but not from denser pixels.
All this "my resolution is better than yours" is getting out of the hand, kind of like CPUs in Android phones; quad core, hexa core, etc. Completely irrelevant its the software that is either efficient or not.
Also reminds me of when the Galaxy S II made its debut and everyone was praising the screen, and I asked myself, did they even see the screen? The screen can't reproduce pure white! Everything is tinted either blue or yellow depending on the viewing angle. Yet the screen got praised all the way to the heavens. Why? Because of the resolution.
If this Retina display came as standard and not at a $500 premium I would be all for it, but as an option it's not worth it.
Actually, the reason I love AMOLED screens is 1- the pure blacks, 2- the excellent contrast, and 3- that they don't act as a lamp when reading in bed at night. I don't care much about color fidelity on a phone. Resolution is fine too, but that's secondary.
What's the point of higher resolution, especially when you can't use the native resolution? Dot pitch is gone! You won't see individual pixels at a normal distance.
This is truly impressive - a laptop screen which approaches the dpi of my phone.
Does it improve usability directly? No. Does it enable a near-flawless screen experience? Yes.
PC makers will bring you a screen like this without a $500 premium, just wait around a bit...
"PC makers will bring you a screen like this without a $500 premium, just wait around a bit..."
Because vendors have been right on Apple's ass to offer the same or better resolutions in IPS displays after they introduced the iPhone 4 and iPad (3)!
It pains me to say this, but PC makers will never get it right. They've had a hell of a lot of time to get it right, but they fall flat on their faces. They constantly release "ultrabooks" with 1366x768 resolutions, which is an absolute, fucking JOKE. What year is this? 2001? I mean, seriously?
The SGS2 had a mediocre resolution when it debuted. Many smartphones of the time had 960x540 resolution, and even 720p phones were announced then. The screen was praised for its increased sub-pixel resolution compared to previous OLED smartphone devices, and for being OLED, which a lot of people like.
You realize that $500 premium is actually a $400 premium, and includes a good quality Samsung 256GB solid state drive, right? Not to mention that the screen is IPS with markedly better contrast ratios and viewing angles than virtually anything else out there.
The average user will benefit because in most applications (those that don't do special text rendering but let the OS X APIs render for them) the text will be rendered much more nicely. This is the same as what happened with the iPhone and iPad Retina displays: in most cases, with no work from the developer, the application's text is much crisper. Then, with some work on the graphical resources (by including 4x-resolution resources with a special naming convention, but with no additional line of code), all the graphics look much sharper. The constant between all the Apple Retina displays is not the marketing behind them, it's the way the APIs handle them. Basically, for most high-level APIs the resolution is standard but things (lines, text) are much crisper; then, when you want, you can specifically render things at double resolution using a new set of APIs which were introduced with the iPhone 4 (and which are the same on the Mac).
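Here's a rough sketch of the idea in Python (not the real Cocoa API, and the file names are made up -- it just illustrates the point-vs-pixel split and the high-resolution resource lookup):

```python
import os

SCALE = 2.0  # backing scale factor on a Retina display; 1.0 on a normal one

def points_to_pixels(x_pt, y_pt, scale=SCALE):
    """Apps keep working in 'points'; the framework maps points to device pixels."""
    return x_pt * scale, y_pt * scale

def load_image(path, scale=SCALE):
    """Prefer a high-resolution '@2x' variant when one exists and the scale is 2."""
    if scale >= 2.0:
        base, ext = os.path.splitext(path)
        hidpi = base + "@2x" + ext
        if os.path.exists(hidpi):
            return hidpi  # sharper asset, same on-screen size in points
    return path

# A 100x100-point button still *looks* the same size, but is drawn with 200x200 pixels:
print(points_to_pixels(100, 100))      # (200.0, 200.0)
print(load_image("toolbar_icon.png"))  # picks toolbar_icon@2x.png if it exists
```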
Probably great. Just set the DPI to 220. Of course, some apps won't scale right and will look funky (some text is tiny, some text runs off screen), but most apps are fine. I've been running a 130dpi setup for a few years now and wouldn't go back to 96dpi. It's amazing how different fonts look at higher dpis.
I would love to run Windows 7 or 8 on a display like this, but I don't want the Apple keyboard or proprietary SSD (and other hardware). I wish PC makers would get with the program.
The display is great. The machine looks great. Finally 16 GB of RAM. All-SSD storage. Check, check, check... but why only a 650M? I'm guessing they just can't cool a 680M or even a 660M? I just don't get it. They don't seem to cut too many corners, but they never give you options (at least good ones for gaming) when it comes to graphics...
I hope more manufacturers start offering SSDs, and the 16GB of RAM is nice. I do think the display is overkill on a portable unit, but whatever, I'm sure it'll suit many people fine.
As for the graphics, sounds about right... it'll play Angry Birds and Fruit Ninja just fine.
Probably heat or cost or both. The 650 is a good improvement from the 6770 though. It's not made to be a gaming machine although it can clearly handle a decent level of gaming, so for most the 650 is fine.
It's running Diablo 3 at 27fps in that picture -- one of the least demanding new titles ever released (exaggerating a bit here) -- hardly a "decent level of gaming" considering it isn't even rendering any monsters...
Apple's engineers clearly never heard the story of Goldilocks and the three bears and finding what was "just right" in regards to screen resolution....
This upgrade amounts to paying $500 for a screen the hardware is too slow to drive, and then losing the ability to upgrade your own RAM and HD...
I own a Macbook Pro now and would love (and can easily afford) a new one -- I'll pass...
Did you read the article? The laptop is running Diablo at native resolution at 27 fps, or in other words, driving more pixels than any other laptop. Ever. Made. (that I'm aware of, anyone know better?)
With the ability to turn down the resolution to a more normal 1920x1200 or even 1280x800 for more demanding games, and minimal artifacting going by other programs, it should be fine as a gaming unit. Not top of the line, but medium/high settings for 2012 games fine.
Did you read the article and actually play Diablo 3? If he's getting 27 fps doing nothing -- what is he going to get in Act 3 Siegebreaker with 40 monsters on the screen? So no -- it won't play Diablo 3 at that resolution...
So your suggestion is turn down the resolution to make it work? Considering you'd have to run 1440x900 to avoid major pixel interpolation problems what was the point of spending all the extra money for that screen again?
The point is - Apple overshot. The hardware in the computer isn't up to snuff for 2880x1800. Most people running that resolution would have an SLI or Crossfire setup. The notebook should have come with a 1920x1200 IPS LCD panel - period.
You're making the mistake of assuming that Apple is targeting gamers with this system. Apple isn't (when do they ever?). At best, being able to game is a nice 'perk' (I've played all of Act 3- and most parts of other acts- on a system that would only pull low-20 fps when idle in Act 1... it was perfectly playable).
If you're doing anything that involves text, using a system with a very high DPI will make it much easier to use for extended periods of time. In addition, if you did iOS development, you'd actually be able to do development for the iPad 3 without having to scale the crap out of the simulator.
Having very high DPI will also be wonderful for anyone that works in Photoshop (or similar programs). I don't know exactly how Lion's scaling will impact Photoshop, but I'm sure both Adobe and Apple will be working to make sure it works well.
There might also be more controls/functionality included with Mountain Lion. I wouldn't be surprised if Lion just has a very limited version of ML control/APIs.
>> You're making the mistake of assuming that Apple is targeting gamers with this system. Apple isn't
Then why did they show Diablo III during the keynote?
Also, I'm not at all convinced that high-DPI is automatically great for Photoshop. E.g. if you're making content for the web, you don't want to be viewing pictures at a totally different DPI from everyone else.
So if I'm reading this correctly they give you a 2880x1800 native resolution screen but then don't allow you to actually display the desktop at that resolution?
Let's sell a car advertised as being able to reach 100mph but electronically limit it to 50mph and then double the readout on the speedometer...
I wonder if the bootcamp driver imposes the same limits on windows?
No, I don't need to read up. Apple is saying that no matter what my use case is, I can't have access to 2880x1800 native, even if I don't care about their text size. What if I want to do medical imaging or photo editing?
Sounds to me like they have an OS that can't scale text, so they decided to do this pixel-doubling BS.
Apps that are tweaked to be "Retina aware" will display their video/images at the full native resolution, while the text and UI elements are scaled so they are still readable. Apps like iMovie, Aperture, FCPX and Photoshop CS6 (a free update is on its way) will all support the native "Retina" resolution for the video/image you are editing.
For apps that have not been tweaked to be "Retina aware", OS X will assume all aspects of the app need to be scaled so they remain readable on the 1800-row panel.
The example you've given is actually real; many car models have their max speed electronically limited (e.g., BMWs, Mercedes-Benzes, Audis, even Teslas). Anyway, it's likely Apple intends to offer native resolution for the OS desktop in a future OS iteration.
The closest competitor to this would be the Vaio Z. I'd like to see it compared to its full HD screen. Comparing it to some ultrabooks isn't exactly the best comparison. The Z, though not considered an ultrabook, is lighter than this and just as, if not more, powerful.
It's not remotely reasonable to expect such high resolution gaming on an ultra-thin device. Not when we barely get these resolutions in a LOT of games in DUAL GTX680 mode on the desktop.
Which is exactly why this "retina" display is such a joke in this notebook -- it doesn't have the power to run today's applications let alone tomorrow's at that resolution...
Does anyone have the belief 1920x1200 is a low resolution in a 15" screen? Does anyone really believe the current Macbook PRO was too thick? Why couldn't Apple have used the additional space for a bigger battery and better cooling allowing better graphics on their "PRO" model?
No... the joke is actually that so many are overemphasizing "gaming" on a MacBook Pro. Gaming is not even remotely the target market for these notebooks. A 650M is PLENTY to run any non-gaming applications now and for years to come. ANY gaming performance it does have is purely incidental to its main purpose.
There are plenty of gaming notebooks out there folks. Macbook Pros are not and never were gaming notebooks. Please folks, season your expectations with some reason.
The more versatile the computer, the greater the market you can reach. Larger sales mean lower costs per unit and larger profits. High-end laptops won't disappear anytime soon; desktops used as domestic computers will, being replaced by laptops like this. The more powerful you make them, the more people will ditch their current desktops and replace them with a top-notch laptop, but you have to sell it in the first place.
Last I checked it's called the Macbook "Pro" -- Professionals might have need for graphics horsepower in situations other than games. Any "pro" is going to want his machine running at the native resolution...
Quite frankly - 2880x1800 on a 15" notebook with these specs is a joke.
There was a question posted here regarding the performance of the glossy display relative to a matte one. It has been removed, along with a number of comments pointing out that as a picture included in the article showed, glare was indeed an issue with this display, at least in contrast to what could be expected from a matte display. The picture which so perfectly highlighted the issue with glare on the glossy display also appears to have been removed and replaced with one that shows far less glare.
Is anandtech in Apple's pocket now, or something? Are we not allowed to point out the issues with glossy displays and glare? Are images which highlight these issues being taken down at the request of Apple, and replaced with subjective "well the glare really doesn't bother me" glad-handing?
If you've got any journalistic integrity at all you'll restore the previous picture at least as a counterpoint to the discussion on glare, and explain why comments pointing out that a glossy display will produce more glare appear to have been removed.
Anyway, people would eventually discover that awful glare as soon as they went to an Apple store to have a look at the laptop. You can't hide certain things.
Interesting implementation of the 2880x1800 panel. I was expecting it to be full 2880x1800 resolution for the desktop with icons and apps scaling to this while keeping the quadrupled usable surface area. This implementation gives you more nuance since you have 4x the rendering capacity, but it doesn't do anything for increasing the usable desktop area.
I am guessing there will be an OSX update in the near future once the ecosystem matures more? or not?
Well, this makes the VAIO S15 look like a bargain. For $1000 you get Ivy Bridge, matte FullHD IPS, mSATA SSD capability, under 2kg weight, and standardized RAM/HDD/Optical bay for 2nd HDD.
Seems people are buying it for the 2880x1800 panel and running it at 1920x1200 anyway. Then there's the issue of the humongous 96WH battery, and the charging times and power brick size issues for this "field workstation."
For me, the MBA13 would be perfect if it came with a matte screen, mSATA SSD, and 4GB onboard + 1 open DIMM slot. Apparently that's "not the way Apple does things". They think glossy is best, end of story. By design they discourage users from buying memory and storage from reputable brands such as Kingston, OCZ, etc. Their iOS products do not take microSD cards even though it's the most obvious benefit from a user's POV. Apple is a highly profitable company through their consistent and persistent greed.
I think Anand is quite short-sighted to lust for the latest and the "best". If you work for the sake of diversity and technology, you would be honest about your needs and be able to strike a balance of good performance, cost, and support for open industry standards, without resorting to elitist, locked-down black boxes of beauty. If you need a Retina display for your work, which I assume is writing this blog, then you are saying you are too good for AMD, Ubuntu, and the rest of the alternatives/underdogs. You are not doing anything for the future of the industry, but are merely a slave to vanity and laziness.
I think Anand got the scaling mechanism very wrong.
When you set it to the scaling equivalent to 1920*1200, it still renders directly to the actual 2880*1800 resolution (if the program hasn't screwed up).
Mac OS X will just provide a different UI scale factor to the Quartz framework for this display than the default 2 (1440 -> 2880), namely 1.5 (1920 -> 2880).
But the scaling appears to happen *before* rendering the actual pixel image, not *afterwards*. Just look at the Retina screen in that scaling mode zoomed up with CTRL-Scroll. No pixel scaling artefacts, but there would be such artefacts if Anand's theory actually held true.
It seems the Application "sees" 1920*1200 "Points" as its *logical* resolution but when it tells Mac OS X to actually draw text (for instance), Quartz will actually scale it to the real resolution *before* starting to render pixels.
Anand's theory would amount to a massive waste of CPU and GPU power, massive waste of RAM (oversized intermediate pixel buffers) and compromised output quality.
But apparently output is always fast and perfectly crisp regardless of how you set the scaling slider.
Only when you run into misbehaving apps which make incorrect assumptions about the physical display resolution (ignoring the scale factor provided by the API), you'll see pixelated output, and in *that* case the medium setting provides the least annoying pixelation. Properly behaving apps seem to scale perfectly in all settings, however, as far as I've seen on the demo machine I've been examining.
Anand, could you please re-examine this?
I'm not *that* deep into the APIs, but the (somewhat) recent change to floating-point "Point" coordinates at the introduction of the iPhone 4 Retina display (instead of integer pixels before) and the "Scale" factor for each display is pretty clear about how this is done.
Open the screen shot .png file in Preview and view it at Actual Size. The screen shot is 3840 × 2400 pixels. That must be the size of the screen buffer. That is then scaled down to 2880 x 1800.
It's just like what would happen if you were to set a normal laptop that supports 1440 x 900 to a resolution like 1024 x 640. The screen buffer is 1024 x 640 and is scaled to 1440 x 900.
3840 × 2400 isn't too unreasonable. It's only 36 MB of the 1 GB VRAM. Graphics cards can support 4k x 4k textures which is much larger.
You don't see any pixel artifacts because the screen shot does not show what it would look like at 2880 x1800.
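To make the arithmetic concrete, here's a quick sketch of the per-mode numbers (Python, my own back-of-envelope figures, assuming the render-at-2x-then-filter behaviour described above):

```python
PANEL = (2880, 1800)  # physical panel resolution
MODES = [(1024, 640), (1280, 800), (1440, 900), (1680, 1050), (1920, 1200)]

for w, h in MODES:
    # "Looks like" w x h: everything is rendered into a 2x backing buffer...
    buf_w, buf_h = 2 * w, 2 * h
    # ...which is then filtered to the panel (down for the larger modes, up for
    # the smaller ones; 1440x900's backing is already exactly 2880x1800).
    scale = PANEL[0] / buf_w
    mbytes = buf_w * buf_h * 4 / 1024 / 1024  # 32-bit RGBA framebuffer
    print(f"{w}x{h}: backing {buf_w}x{buf_h} (~{mbytes:.0f} MB), panel scale {scale:.2f}x")

# e.g. the 1920x1200 setting -> 3840x2400 backing (~35 MB), scaled by 0.75x down to 2880x1800
```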
The full-size screen shots are indeed a plausible argument.
I've been used to the pixel zoom (CTRL-scroll) zooming into the *physical* pixel map, but on re-examination it seems that it zooms into the *virtual* pixel map on the Retina Display instead, so the lack of any visible scaling artefacts there is indeed *not* an argument against pixel scaling in the non-native scale setting.
With the limited array of software on the demo machine I couldn't verify this exactly, even with an optical loupe, but the difficulty of discerning any artefacts, even under the loupe, in all of the available scale settings basically removes quality degradation from being a practical concern.
So at this point I retract my earlier assertion that pixel scaling was *absent*. It seems that in the non-native scale settings it is being used to handle the scaled modes, but the main mechanism is still the logical Points coordinate system being independent of the physical resolution – that is why most well-behaved apps should not need any adaptation (apart from improved icons).
I'm with you, Jeff. I'd literally buy a PC version of this today, if it were available. My work is tied to the Microsoft platform though, so this wouldn't really be a realistic purchase for me, especially at such a high price.
It's a PC that happens to be made by Apple and comes with Mac OS X pre-installed, but there's nothing stopping you from just running Windows on Apple hardware these days.
First of all, a MacBook of any type is not a gaming rig. I don't think those who have been criticizing Anand really know what they're saying, have really read anything on this site, or play many Blizzard games. 20 fps at 2880x1800 is pretty good for any machine. It is anecdotal and posted in the Pipeline. I am impressed a 650M is even pushing any game at 2880x1800.
I have the first two Diablos, and it goes without saying that as you get to the end of the game it starts to throw crazy numbers of foes at you and is therefore more taxing on the system. Now, I also have SC II and it runs at ~25 fps on my desktop and I never thought anything of it. Blizzard doesn't program the way most game developers do. They often don't have high hardware requirements and they typically make third-person games.
I can understand the slider approach when not all applications have been updated to support the screen. Truth be told, Macs are sold to old and young people, and that native resolution is not going to fly all the time on a 15" screen. I would guess 90% of Mac users are not even going to change the res from 1440x900. They're gonna talk about how good the screen looks.
Disclaimer - No fanboy affiliation. If I lean towards any hardware, it would be AMD. Unfortunately they break my heart more often than I can tolerate and I buy according to what makes sense for me. Ok, I am an SSD fanboy - they have made all my machines so much more usable.
Professionals might actually run things like CAD or other graphically intense applications other than games. They might even want to run them at their native resolutions... games are simply a good way to stress the capabilities of a system. Usually if it plays games well, it does everything else great...
How exactly does this higher resolution screen benefit professionals again - by sapping more of their profits with its higher cost?
What does a CAD application have to do with desktop resolution?
And who is forcing professionals to buy Apple? It's not really a secret that Apple has been far from being a bargain vendor.
This website is insane.
Every. Single. Laptop review gets spammed with "need more resolution"-comments and now that somebody does extremely high DPI, people have to be negative about it.
I guess the web made it cool to be "anti" everything.
I'd also like to add a couple more ubiquitous complaints:
Display reviews: "TN panel - DON'T CARE"
Laptop reviews: "Base model includes 5400 RPM hard drive - DON'T CARE"
Laptop reviews: "Still too thick with ancient optical drive tech - DON'T CARE"
And so Apple offers an IPS panel, high performance SSD, and ridiculously thin profile for corresponding specs for a relatively cheap price, and people complain that the display resolution is too high! Meanwhile, you can just run the stupid thing at 1440x900 and still have all the other benefits... and yet we get 90% complaints.
In the Diablo screenshot, the upper left corner says FPS 27. With nothing but the player character and a merc on screen, it's only 27 frames per second. It's barely playable at this frame rate; with mobs on screen it'll be a slide show.
"Apple chose to deal with the issues of glare and reflections by integrating an extremely bright panel with great black levels."
In other words, they chose to continue peddling shitty, glossy screens, taking cues from the third-tier schlock being sold at Best Buy.
"Great black levels" don't mean jack when the screen is covered by sheen of reflection in all lighting conditions. Even Apple's own marketing shots show the "deep blacks" being washed out by the asinine gloss on all of their products. Are these the "great black levels" we're supposed to be all jizzed about:
Anti-glare is actually achieved by adding a layer of film to the display. Every display that has this layer shows white through a 'filter', resulting in 'sparkles' rather than true white. Taking this display and throwing some anti-glare goo on there would be like taking a Ferrari and getting some Home Depot paint to change the color.
For the best experience, the display should be in a fairly dim area, away from light sources which cause reflections.
When that can't be the case, you can apply the anti-glare film, but then you're basically never going to have a clear picture again.
This is just maddening... what the hell is wrong with PC laptop manufacturers? Given that PC laptops are not restricted to the control of Apple, we should have had MASSIVE display resolutions YEARS ago, before Apple came out with this.
I'm seriously tempted to switch to Apple just because of this, but reading about all the "restrictions" is really irritating, too. Why can some applications not "see" the 2880x1800 resolution? Why doesn't Apple use standard 2.5" SSDs?
There will never be a "best of both worlds," will there?
2.5" is too much for SSDs. I really like the "SSD stick" that Apple introduced with MacBook Air and would be really nice if it was a standard.
And "Retina aware" apps can use the full resolution. Final Cut (or is it iMovie) can show a full 1080p stream on preview mode, something it couldn't if was restricted to a "virtual 1440x900 mode". The OS just don't render everything at 2880x1440 because UI would be really small.
Games have always had a bad habit of coding for specific resolutions instead of adapting for any available resolution, so it's no surprise that they're the worst offenders. Changing aspect ratios is even worse. It seems like even AAA games are made by companies who all have identical monitors.
A question: is 1440x900 supported? Because it is exactly half of 2880x1800, so 1 logical pixel becomes 4 physical pixels; if you have a problem with a game you can drop down to that resolution without the usual problems of running a display at a non-native resolution. Another question: are 1680x1050 etc. a resolution scaling like Windows DPI scaling (100-125-150%), or a real change of resolution? If not, how does the display work at those resolutions, and are they usable? Because if I use Windows or games (I understand this isn't a PC for gaming, but sometimes I like to, and I think OS X + Windows can give a better experience), this becomes more important, especially 1440x900 non-Retina.
The game can support whatever resolutions it wants, then it will use bilinear or bicubic resampling to get down to that size. 99% of games will go with native resizing support - and aside from text, it'll look better, because fuzzies look better than jaggies - and retro style games will already be using their own engines to make the screen look like it should.
Yes, but I'm asking what those resolutions look like. Try using 1280x800 on a MacBook with a 1440x900 native panel and everything looks wrong, because 1 logical pixel is represented by 1.xx physical pixels. A 2880x1800 panel gives you the possibility of using 1440x900 without that problem, because exactly 4 pixels become 1 pixel, and images and text stay correctly in focus, just like on a display with a 1440x900 native resolution. The other question is what the other resolutions (no scaling) look like. Sorry for my bad English.
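Right -- the clean case is simply when the native resolution divides evenly. A quick way to see which modes map to exact pixel blocks (Python, just illustrative):

```python
from fractions import Fraction

NATIVE = (2880, 1800)

def scale_ratio(w, h, native=NATIVE):
    """Physical pixels per logical pixel for a given display mode."""
    return Fraction(native[0], w), Fraction(native[1], h)

for mode in [(1440, 900), (1680, 1050), (1920, 1200), (1280, 800)]:
    rx, ry = scale_ratio(*mode)
    exact = rx.denominator == 1 and ry.denominator == 1
    print(mode, f"{float(rx):.3f}x", "exact pixel blocks" if exact else "interpolated")

# 1440x900 -> 2.000x: each logical pixel is a crisp 2x2 block.
# 1680x1050 (~1.714x), 1920x1200 (1.5x), 1280x800 (2.25x): fractional ratios,
# so some resampling softness is unavoidable.
```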
Really? Buying $2,500 laptops to play $80 video games? Sorry, but I'm growing truly tired of reading comment after comment about expensive hardware and its gaming capabilities. It's not just this site. People actually do more important things with computers, believe it or not, than play games.
audio editing
video editing
photography editing
computational fluid dynamics
CAD - computer-aided design
web design
software engineering
live conferencing
etc etc etc
If you want to play games... buy a PlayStation. You'll save yourself about 2200 bucks... and have a much better experience. Thank you, stay in school, and say no to drugs.
I don't understand why all the bashing about an underpowered GPU or overly expensive display.
Game in Windows at 1440x900 and you *shouldn't* have scaling artifacts and you've got an equivalent experience to an Alienware M14x. Is that a "joke" of a gaming machine? No, it's pretty legit.
Do anything non-gaming and take full advantage of the pixels - images AND text will look spectacular.
All in an enclosure that is lightyears ahead of the competition. It's not cheap, but you get what you pay for.
"If you have the means, I highly recommend picking one up"
Could you compare the new MBP display with the existing DreamColor and PremierColor displays offered on HP's and Dell's existing workstation lines, respectively?
It runs considerably cooler than my 2010 MBP 17". As an example, I fired up Diablo III at 2880x1800 and with most settings at High, plus AA on. I'm only in Normal Act III, but I was able to play it comfortably, with it dropping to 20 FPS when there were a couple dozen monsters onscreen. From what I understand, dropping the resolution would increase the frame rate, even though it's still ultimately a 2880x1800 display.
This was fairly nice, but what actually impressed me was that I ran this test with the machine on my lap and it did not get uncomfortable. My 17" MBP can be uncomfortably hot under much lighter of a load.
And, the fan noise is much more of a white noise, rather than the typical turbine-spinning-up sounds from laptop fans. I had to put my ear close to the screen to make sure that's what it was.
It's a very impressive machine that seems to run cooler and quieter than its larger predecessors, yet have more CPU and GPU power.
I have a co-worker that recently bought a Vaio Z and now wants one of these. I have another co-worker that wants to know what the heat output is like while playing games. My personal question: how does the console window look with really small fonts?
In my brief experience, the cooling plus the new generation of processors is amazing. I can keep it on my lap, even when gaming, and it doesn't get as hot as my 2010 MBP 17" does. The asymmetrical fans make a big difference as well: the noise is more of a white noise, noticeably subtler than older fans.
Another factor that's new to the Mac (at least since my 2010 model) is that it switches between Intel and GPU graphics on the fly, which also probably helps keep things cooler than if you had to choose 100% GPU.
A quick test: I ran a process in R that pegged all 8 "cores", and by about 30 seconds, the CPU temperature was up to around 200 F, by about 45 seconds the fans had spun up to about 3,000 RPM and were definitely audible, though more of a white noise. I stopped the process at that point (the fans spun up to around 4,000 at their peak), and 40 seconds later, the temperature was back down to about 140 F. All of this while the machine was on my lap and it got warm, but not uncomfortable.
No matte display, no purchase. At least as an option. Even if more expensive. Sign the petition at MacMatte (matte petition) http://macmatte.wordpress.com
Correct me if I'm wrong, but that "resolution" setting that you keep talking about isn't actually physical resolution at all, is it? It's basically equivalent to Windows 7's font size control, but with more predefined options, and it starts out at a good medium for everyone instead of at the smallest possible size. It doesn't appear to label them with numbers anywhere, so did you just best-guess your way to the numbers? It sounds like no matter what setting you select, the panel will always run at its maximum resolution, it just changes the size of font and UI element rendering. It also makes absolutely zero sense for Apple to render anything at double resolution as well - unless they're doing FSAA, which would be dumb for text, so that's a silly thing to speculate about.
It seems either disingenuous or dumb to criticize Apple for not "making the resolution available" when you don't even understand what the feature is. No one can actually read text that's 10 pixels high when you have 200 ppi, so there's no point in offering that setting.
No, this is not font size settings, this is the resolution slider. It's in the same place as the resolution slider used to be without a HiDPI screen, and it says right on the left side in the fricking screenshots what the resolution is.
It seems really dumb or disingenuous to criticize Anand when you clearly don't have a clue what you're talking about.
I've played with a demo machine quite extensively and I'm pretty sure the slider does *not* set an actual "resolution".
It just adjusts the purely virtual base coordinate system to one of several possible settings.
The actual drawing is *not* done at those virtual "Point" resolutions, however, but is actually done at the *physical* resolution the virtual "Points" coordinate system gets mapped to. By well-behaved apps which use the provided "Scale" factor properly, that is. Those which ignore the scaling mechanism which has been there for a while and still insist on generating background pixel images themselves will result in pixelated output.
So you can set the equivalent of 1920*1200 from the application's point of view, but the actual rendering still happens at 2880*1800, just scaled differently (but mathematically *before* actual pixel rendering, not afterwards!).
So the scaling slider only adjusts how big the menu bar, other UI elements and window contents are being drawn, but not the rendering quality, which always remains at maximum (again with well-behaved apps).
I've got the 15" 1680 x 1050 Hi-Res Matte screen and my first concern was going from that to the 1440 x 900 equivalent scaling. Glad to see Apple has the option to run at a 1680 x 1050 equivalent if I so choose. I haven't decided if I'll get it or not, but I'm definitely going to check one out in person.
The simple fact is, if 90% of the things we do on the laptop cannot use a Retina display's full resolution, Apple should have made this laptop 1920x1200 native resolution. What most people do on a laptop is read and write text. I would rather do that at the native resolution than at a downsampled resolution.
@The simple fact is, if 90% of the things we do on the laptop cannot use a Retina display's full resolution,
You seem to misunderstand one of the following terms (resolution | scaling | pixel density); I'm not sure which of these is tripping you up. I'm also not sure what you mean by "things we do on the laptop". If you are referring to correspondence (e-mail, social web, chat), productivity (word processing, spreadsheets, form-based data), and recreation/leisure (web browsing, gaming, streaming media), then 100% of your apps can take advantage of the Retina display.
Text, graphics, and window elements will be the same size as on your old MBP, but double the resolution. Applications without high resolution resources get pixel doubled until an update is available.
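Pixel doubling itself is about the simplest scaling there is, which is why un-updated apps still look acceptable -- just a bit soft: every source pixel becomes a 2x2 block, with no new detail. A toy sketch (Python, purely illustrative):

```python
def pixel_double(image):
    """Nearest-neighbour 2x upscale: each pixel becomes a 2x2 block."""
    doubled = []
    for row in image:
        wide = [px for px in row for _ in range(2)]  # duplicate each pixel horizontally
        doubled.append(wide)
        doubled.append(list(wide))                   # duplicate the whole row vertically
    return doubled

# A 2x2 'image' of grey values becomes 4x4: same content, just bigger pixels.
tiny = [[10, 200],
        [70, 140]]
for row in pixel_double(tiny):
    print(row)
```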
He wants to work at the 1920 resolution (an option offered by Apple).
At this resolution there's not a 1:1 correspondence with the 2880x1800 screen, and everything will receive some resampling unless it's an element that can be drawn using Apple's new Retina code.
Even with the new screen rendering options, you have to buy all-new applications to get versions that work with Retina (if they even release this year; even Adobe is going to take a while).
So for now, 90% of things can't use the Retina advantage.
And things like web graphics will always need to be resampled when displayed on a Retina screen if you're in any resolution other than 1440x900.
I totally agree with the dream setup you mentioned (lower-end 15" Retina plus 512GB flash). Apple's decision to drop the storage upgrade option really disappointed me. I know there are some workarounds, like hooking up a Thunderbolt external HDD for file storage. However, that means I have to carry it around and hook it up, which is extremely inconvenient. After all, there's no point in getting this MBP over the original 15" if I have to lug around a drive with it, right?
It looks like you're using AA when running at 2880x1800 (you have AA enabled in the screenshot).
When running at the native resolution, there isn't any aliasing to contend with, so you don't need it. Disabling AA will likely give you back 15-30% of your performance.
Can the Retina display be reduced to 1440x900 in System Preferences to lower the strain on the 650M for 3D programs like Maya, modo, and Cinema 4D? I'm fine with working at the lower resolution while in these programs, then resetting to Retina for everything else. But the mention of a performance hit scares me, as 3D programs already tax the graphical capabilities of mobile GPUs at lower resolutions.
Will buy a "Ivy Bridge" laptop for sure. This MAC is a very good option I'll think. Have everything I wished for, a size 14 would have been perfect! Currently using an old Sony Z 13" and don't see anyone in PC biz that can compete with Apple atm, poor SONY that neglects software updates, fills computers with BLOATware and uses stupid RAID SDD soloutions in new Z series.
I really don't see where any of these negative comments are coming from. Not that I'm an Apple fanboy or anything; I build and repair PCs for a living, and it would be a nightmare if someone brought this thing to me, but that's to be expected. But I keep seeing comments such as "there is no usability improvement by increasing resolution" or "why have a discrete GPU for anything but gaming" or "1GB of VRAM is not enough!"
I also do a lot of enthusiast photography, where Apple is a VERY popular name, and the increased resolution is more important than almost anything else. The same goes for video editing and graphic artists. These are major fields for Apple computers, and this is a huge improvement. After using the new iPad, there is no way I can go back to viewing and editing photos on a sub-par screen with lower resolution; this laptop is going to be a godsend.
Also, the discrete GPU has a huge impact on a variety of applications other than gaming. Have you ever tried to switch between integrated and discrete graphics on laptops that have hybrid graphics? Or, even more specific, the 2008 MBP with the 9600GT? It slows down Photoshop, Aperture, all video editing software, and even Picasa! Also, with Macs it's not like a PC where you need the firmware modifications of a professional-grade video card that costs 10x the amount. A consumer-grade video card will give a Mac every bit of the 3D rendering capability in CAD programs that a Quadro does, minus the boost in video RAM. And as some other posters pointed out, the 1GB of RAM on this card is NOT the bottleneck in any gaming or consumer use; it's the slow speed of the 650M, and of course there is NO WAY to cool a higher-end card in this form factor without drastically increasing fan noise, which Apple will not accept.
By the way, on my desktop Diablo 3 runs at 20-30 fps most of the time and only goes to 60 in certain scenarios, and I'm using a 4.8GHz Sandy Bridge i7 and a GTX 680. It's completely playable; it's part of the game's engine. It only gets unplayable and glitchy when jumping back and forth between 10 and 30 on a lower-end computer (and this laptop would be considered a low-end gaming desktop in that respect). I'm sure playing at full resolution would be plenty playable, just not quite as smooth as 1440x900.
As a photographer, this laptop may make me leave apple.
I need more screen real estate than 1440x900; there are just too many palettes/dialogs/programs I want to work with.
And even Apple says, right in the MBP's screen resolution dialog, that using 1920x1200 on this laptop isn't a good idea.
Looking at the images here, I can see exactly what I was worried about. At 1920x1200, web pages look like crap because all the graphics are being resampled for display on a 2880x1800 screen.
Good one, Apple. Take away a true 1920x1200 resolution laptop.
And getting rid of the 17"? Really? Yeah... good one.
I had C2D and Sandy Bridge MBPs. Under heavy CPU load for 5-10 minutes they noticeably throttle the CPU which defeats the purpose of getting a fast CPU. They hit the Tjunction temp very quickly. Has thermal dissipation improved in this year's MBPs?
As raised by other readers in these comments (@The Von Matrices, @B3an, @EnzonFX, et al.), would you (Anand, or anyone else that might know) recommend waiting a bit before purchasing an Apple Cinema Display because it's likely Apple will be providing a much higher resolution ("retina" or otherwise) on their Thunderbolt 27" display, or will it be at least the better part of a year or more before they could possibly do anything approaching this resolution in a 27" format?
I played with one today, it's gorgeous of course. Thoughts:
a) I have a sweet early 2012 MacBook Pro already, with a 240 GB SSD and a 1 TB hard drive where the CD-ROM drive went. This is sweet. I spend 2 months a year on the road (in the U.S., as I am usually in Japan), and having that extra space for movies, iTunes, and Dropbox files coming in is good; switching to a 768 GB SSD only would be hard on me.
b) I am nervous about slow-to-update apps. It would not be unlike Adobe to say "Upgrade to CS6.1 to get new features like support for Apple's new resolution," and I just upgraded. That said, I'd likely run it in smallest-text mode since it's not that hard to read, and I would not notice any resolution issues that way.
c) I am really looking at a holistic upgrade. I use my laptop heavily, doing hardcore business stuff with it, but it's attached to an Apple 27 inch display much of the time. Give me a new Thunderbolt display with higher res, USB 3.0 and all that, and we can talk.
d) Apple is likely, hopefully, at the beginning of a major roll out of new stuff with higher resolution, I want to see where that goes a bit.
I won't be upgrading, but as an Apple shareholder I recommend that all of you do. ^_-
I always thought it was me (or my card) before, not the slot in the MBP Early 2011 that was a problem. I wonder what the workaround is supposed to be - repeated insertion, or applying pressure?
One question: I noticed the Kensington slot is gone from the sides - has it moved to the back, or is it just gone?
I did some experiments on my Mac Pro with ATI Radeon HD 5870 1 GB and Apple 30" Cinema display (2560x1600 native) running Mac OS X 10.7.4.
I was able to use SwitchResX to create and use scaling resolutions up to 4096 wide or 4096 tall but the screen buffer was limited to under 32 MB which means the highest 16:10 resolution I could create was 3584 x 2240. I guess this is a limitation in the Radeon driver since the MacBook Pro with Retina display can go higher. I wonder if it can go higher than 3840 x 2400?
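For context, a quick back-of-the-envelope check of that 32 MB figure, assuming a plain 32-bit (4 bytes per pixel) screen buffer with no padding; the actual driver limit may be counted differently.

```python
# Rough check of the ~32 MB framebuffer limit described above, assuming
# 4 bytes per pixel (32-bit color) and no row padding.
def framebuffer_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / (1024 * 1024)

for w, h in [(3584, 2240), (3840, 2400), (4096, 2560)]:
    print(f"{w} x {h}: {framebuffer_mb(w, h):.1f} MiB")

# 3584 x 2240: 30.6 MiB -> just squeezes under a 32 MiB cap
# 3840 x 2400: 35.2 MiB -> over the limit this card/driver allowed
```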
I was able to use the Mac with a desktop that was 3584 x 2240 which is scaled down to the native resolution of 2560 x 1600. I was also able to play games at 3584 x 2240 (a cheap or expensive anti aliasing method but it makes the game cursor even tinier). Screen shots were also 3584 x 2240.
SwitchResX creates the resolutions by editing a file in /System/Library/Displays/Overrides corresponding to the display. It also adds contextual and menu bar menus for changing screen resolution.
I used Quartz Debug.app from Xcode 4 to enable HiDPI display modes. These HiDPI modes are one quarter (one half vertical and horizontal) of any resolution that is greater than 1600 wide. I believe these modes are what the MacBook Pro with Retina display uses. I wonder if these SwitchResX menus can be used on the MacBook Pro with Retina display to select modes that are not HiDPI?
I tried the various HiDPI modes including 1792 x 1120. The text does seem smoother than if I used a non-HiDPI version of the 1792 x 1120 resolution. Of course the best HiDPI mode for the 30" Cinema Display would be 1280 x 800 but it doesn't leave a lot of screen real-estate. 2560 does not divide by 1.5 evenly so you can't get a mode that would look exactly like the 1920 x 1200 HiDPI mode looks on the 2880 x 1800 Retina display without some extra work.
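A small sketch of the mode arithmetic described above; the list of base resolutions is just an illustrative sample, not what SwitchResX or OS X actually enumerates.

```python
# Sketch of the HiDPI mode derivation described above: each HiDPI mode is
# half the width and height of a regular mode wider than 1600 px.
# The resolutions listed here are example inputs only.
regular_modes = [(1792, 1120), (2048, 1280), (2560, 1600), (3584, 2240)]
hidpi_modes = [(w // 2, h // 2) for w, h in regular_modes if w > 1600]
print(hidpi_modes)  # [(896, 560), (1024, 640), (1280, 800), (1792, 1120)]

# And the 1.5x problem: a "1920x1200 look" wants a 1.5x backing scale,
# which works out evenly on a 2880-wide panel but not on 2560.
print(2880 / 1.5, 2560 / 1.5)  # 1920.0 1706.666...
```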
It seems that apple has taken away some ports, including the audio line in that the previous MBP had. Does the headphone port also support audio line in?
I was on a conference call using Skype with my iPhone headset plugged into the single jack, and my colleague seated nearby complained that I was making scratchy noises when I moved the mic around and it touched my face; he could see and hear it happening.
Bottom line: MBPr display is overkill for business app use, but looks better than other physically larger external monitors.
Monitor 1: MBPr set to "best for Retina"
Monitor 2: LG Flatron M2380D 1920x1058 connected via Mini DisplayPort to VGA (yeah, MDP to HDMI SUCKED, had bad underscan at default setting, and terrible pixelation/aliasing (? I don't know the right term, several commenters here will blast me for it, whatever) making the fonts look bad at any underscan adjustment level)
Monitor 3: NEC MultiSync 90GX2 1280x1024 connected via Mini DisplayPort to DVI
MBPr display is really only usable at the "best" setting; scaled to "larger text" reduces real estate too much; scaled to "more space" makes stuff too tiny even for my good eyesight. And BTW, Safari (for example), which is supposed to be optimized for MBPr, behaves exactly the same as every other app I put on that monitor, whether it's a Mac OS app or a Windows app like Outlook running via Parallels. Definitely looks crisp, though.
LG M2380D is worst of the three, although it obviously has a large physical presence. Text and images simply not crisp, but at least using VGA it's usable. I just don't get why HDMI didn't work, whether from HDMI to HDMI, or MDP to HDMI (I tried both).
NEC 90GX2 is pretty good. Glossy like the MBPr, which I like, and pretty crisp.
Note: Weird that the LG's resolution is 1920x1058; I think the monitor's is actually 1920x1080. I was going to say that maybe the missing 22 pixels were taken up by the Mac menubar (as I got the resolution figure from the Windows Display control panel in Parallels), BUT, under that hypothesis, the NEC should also have had a deduction (since it too has a menubar thanks to Multimon). OTOH, maybe the LG registers the loss because that is the primary display in the Mac OS Display System Preferences, while the NEC is not, and maybe Multimon's second menubar on the NEC is seen differently by Windows. I'm way past my ability to provide any more than conjecture, however. :-]
188 Comments
piroroadkill - Tuesday, June 12, 2012 - link
GT650, OK..But only 1GB VRAM? At this resolution? I'd put at least 2 in there with that kind of resolution..
piroroadkill - Tuesday, June 12, 2012 - link
Read through it a bit more. Damn, it's ghetto that it doesn't expose the correct resolution by default, and that text is ballsed up in non-"retina" aware applications. Christ I hate the term "retina" to describe displays, it drives me up the wall.
EnzoFX - Tuesday, June 12, 2012 - link
The article clearly explains why the resolution isn't exposed to the user. It seems to be the natural choice. Also, who cares that it's called Retina? It's just a name, and the marketing is correct assuming you're looking at it at the right distance =P.
desta23 - Tuesday, June 12, 2012 - link
I guess using that logic -- all screens are "Retina" :D
mcnabney - Tuesday, June 12, 2012 - link
That is correct. So far the three Retina displays have widely different resolutions and DPIs.
skogul - Tuesday, June 12, 2012 - link
Since the user interacts with these three devices at different distances, this makes sense.
The idea is that, assuming the device is a normal distance from you, and you have normal eyesight, the resolution is sufficiently high that you won't distinguish pixels.
surt - Wednesday, June 13, 2012 - link
The normal distance for a laptop isn't enough for this to meet that hurdle. They've thus changed retina from reasonably truthful marketing gimmick to misleading/false advertising.OrionAntares - Wednesday, June 13, 2012 - link
That would make my three year old phone "retina" and my current monitor and laptop monitor "retina" as well...
ImSpartacus - Tuesday, June 12, 2012 - link
Distance matters. The PPI and pixel count are only one part of the equation.
http://goo.gl/dNkj6
I made a table for reading the minimum rows of pixels required for the iPhone 4's "Retina" effect (at 10") from a specified distance.
As you can read from the table, a 16:10 15.4" screen needs 1922 rows at 14" and 1682 rows at 16" away. Since the "Retina" MBP has 1800 rows (from 2880x1800), I say that it's "Retina" from 15" away.
You can also find that a cheap 22" 1080p monitor is "Retina" if you're about 33" away. But the moment you get closer than that, your 20/20 vision can discern individual pixels.
It's a fun table to play with. If you care about the technical junk, Apple had a slide about it in the iPad 3 reveal:
http://www.blogcdn.com/www.engadget.com/media/2012...
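For the curious, here is a rough sketch of the arithmetic behind a table like that. It simply takes the iPhone 4 benchmark (~326 ppi viewed at 10 inches) and scales the required pixel density inversely with viewing distance; the exact row counts depend on the acuity threshold you pick, so this lands near, not exactly on, the 1922/1682 figures quoted above.

```python
import math

# Rough sketch of the "retina at distance d" arithmetic behind the table.
# Assumed benchmark: iPhone 4 (~326 ppi) viewed at 10", scaled inversely
# with viewing distance. Exact figures depend on the acuity model used.
def rows_needed(diagonal_in, aspect=(16, 10), distance_in=18,
                ref_ppi=326, ref_distance_in=10):
    w, h = aspect
    height_in = diagonal_in * h / math.hypot(w, h)       # physical panel height
    ppi_needed = ref_ppi * ref_distance_in / distance_in  # density needed at this distance
    return ppi_needed * height_in

for d in (14, 16, 18):
    print(f'15.4" 16:10 at {d}": ~{rows_needed(15.4, distance_in=d):.0f} rows')
# ~1900 rows at 14" and ~1660 at 16" -- the same ballpark as the 1922/1682
# figures above, so a 2880x1800 panel clears the bar from roughly 15" away.
```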
lowlymarine - Tuesday, June 12, 2012 - link
This whole thing seems a bit wasteful here, to be honest. For example, I decided to measure how far I sit from my 15.4" MBP during normal usage and came out with ~26 inches. So according to your chart, my 1680x1050 screen is ALREADY a "retina" display! And certainly I can't discern individual pixels on it under normal circumstances. Heck, I sit about 30 inches from my 27" 2560x1440 display so it comes pretty close.I'm frankly much more interested in the IPS aspects, especially viewing angles. The MBP already had one of the best panels available in a notebook from color and contrast perspectives, so further improvements there will just be icing.
I'm also interested in how they plan to keep a GT 650M and 45W IVB quad-core cool in a 0.76" chassis, when the current SNB models are 1" thick, sound like 747s under load, and still run up to nearly 90C.
Zodd - Tuesday, June 12, 2012 - link
Only apple screens are retina, since its their trademark. Same as their logicboard(motherboard) or magic mouse(mouse).
enderwiggin21 - Tuesday, June 12, 2012 - link
Apple screens are Retina (tm) (Uppercase "R"). Any screen, including Apple's, can be "retina" (lowercase). The term existed prior to Apple's co-opting of it.
AllAboutMac510 - Sunday, July 1, 2012 - link
umm no. I see the pixels on my laptop and phone and iPod Touch (even though it is the 4th gen 'Retina') If you see the pixels, it is NOT Retina...UnamusedPunk - Sunday, June 17, 2012 - link
Where does the article "clearly explain" why native resolution isn't exposed to the user? The writer only hints at the fact that portal 2's text dialog window is very difficult to read at native resolution. Quit overstating.ImSpartacus - Tuesday, June 12, 2012 - link
Two months after release and we will have apps to completely expose the correct resolution to the system. I'm sure there will even be hacks that let you banish the word "retina" from system dialogs.BSMonitor - Tuesday, June 12, 2012 - link
Apparently less than it does to curse Christ over gaming resolutions............EPIC FAIL
pcgeek101 - Tuesday, June 12, 2012 - link
I'm with you, piroroadkill -- the "retina" term to describe high resolution Apple displays drives me up a wall, too. So does the use of non-standard SSDs, and other standard parts.
Tegeril - Tuesday, June 12, 2012 - link
If they used a standard SSD, they wouldn't be shipping in this form factor. Deal with it, buy another company's product. How are they hurting you?
qooboot - Tuesday, June 12, 2012 - link
> How are they hurting you?
> Deal with it, buy another company's product.
Well fuck, he (and I) probably will. Just because there are other options out there doesn't mean that we can't voice our opinions on what Apple releases.
kristoffe - Saturday, June 16, 2012 - link
they could easily fit the 1.8 to 2.5 difference. teardowns show that if you know anything about EE. Deal with it... probably not. Typical fanboy response instead of looking into the critical analysis.Some people like not getting wallet raped for things they could have the higher level option of repairing. Aside from that, who the hell is going to benefit from a 2x bumped resolution in a form factor of the hyp of a triangle that at max is 15". people who read 3-4 tiny pdf pages? no, maybe 3d artists in wireframe mode? probably. photographers? lol, no. you can't edit on such a small, cramped screen, you might as well just use the iPad3, which is great for pdf reading and a much better form factor. overall the mbp 2012 q2 is a logical fail, and wallet rape for useless features. The lightness is probably a big win, but soldered in memory? why not just go ipad? this is an asus transformer with optional keyboard attachment as one paired with a screen that is too small for normal functionality...
with a tiny LIF 1.8" SSD that isn't even standardized. That would ALSO help techies and Apple themselves. How Apple would benefit? Production cost with standard parts would cut 15% off the pricetag, and repairs would be a snap.
puggsly - Tuesday, July 3, 2012 - link
People trash on Apple as not innovating because they use standard off the shelf parts and people trash on them for custom designed parts, they just can't with some people.But more to your thoughts that there is no reason for this.....look at what it brought the last time Apple went crazy nonstandard with the Air. Apple created a new class of laptop that only now, 5 years later, are vendors who use "standard" parts catching up to.
solipsism - Thursday, June 14, 2012 - link
The 2880x1800 display isn't standard either but I haven't read where you are up in arms about that. Hell, their SSD card is more standard in that it uses NAND, a Samsung controller, and a mSATA connector. I see no reason why anyone with the desire and know how can't make their own SSD cards to go into this machine.owned66 - Tuesday, June 12, 2012 - link
1GB is more than enough
my pc uses about 150mb of vram at 2560x1600
so its not a problem
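A rough illustration of why plain desktop use barely dents VRAM, assuming simple 32-bit full-screen surfaces; games fill VRAM with textures and geometry rather than the framebuffer, so the numbers below only speak to the desktop case.

```python
# Why desktop use barely touches 1 GB of VRAM: even a few full-screen
# 32-bit surfaces (front/back buffer plus some window surfaces) only add
# up to tens of megabytes. Figures here are purely illustrative.
def buffer_mb(w, h, bytes_per_pixel=4):
    return w * h * bytes_per_pixel / 2**20

per_frame = buffer_mb(2560, 1600)           # one 2560x1600 surface
print(f"{per_frame:.1f} MiB per surface")   # ~15.6 MiB
print(f"~{3 * per_frame:.0f} MiB for a triple-buffered desktop")  # ~47 MiB
```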
shadvich - Tuesday, June 12, 2012 - link
kill me, it hurts to live
Latzara - Tuesday, June 12, 2012 - link
That's true - for displaying your desktop - but displaying a 3D scene in modern games with loads of textures pushed into it at that resolution -- that just won't cut it -- But anyway, it's not a gaming rig so it's not expected of this productsigmatau - Tuesday, June 12, 2012 - link
Of course it's not for gaming. It's a Mac.
Lonyo - Tuesday, June 12, 2012 - link
Then why put in a discrete GPU at all?
Jumangi - Tuesday, June 12, 2012 - link
Looks nice on the spec sheet? Never understood the reason for a discrete GPU on Mac laptop myself.addabox - Tuesday, June 12, 2012 - link
Because there are possible uses for a laptop other than "display the desktop" and "run video games at high resolutions"?seapeople - Tuesday, June 12, 2012 - link
No, it's either Microsoft Word or Crysis at max settings on full resolution.
There is no point in anything in between.
fragger505 - Friday, June 22, 2012 - link
Hardware accelerated video en/decoding?
Tegeril - Tuesday, June 12, 2012 - link
Yeah there are certainly no computational tasks that work well on GPUs besides video games.kristoffe - Saturday, June 16, 2012 - link
lol. you would NEVER fold on a mobile processor, what the *** is the point of that? cuda or opencl on a restricted <560, come on get with the program. small desktop, 680 or even 690, 480s on sale in sli, and a quick z77 with something paired to it like a 6-12 core x2 threaded, you wouldn't use this eye garbage to do any serious work with it. It would pop at 90c and it's diode sensor would fail easily over intensive work. troll.ThreeDee912 - Wednesday, June 13, 2012 - link
Programs like Final Cut X and Aperture heavily rely on the GPU for video decoding and rendering, and generating realtime image effects and animations using OpenCL.In particular, the system requirements for Final Cut X specify OpenCL-capable graphics. People managed to hack Final Cut X onto old Macs without OpenCL graphics cards, which caused a fallback to CPU rendering. Instead of being able to layer on multiple effects while playing back an HD video in realtime, a single effect brought the program to a crawl.
seapeople - Tuesday, June 12, 2012 - link
I think it's safe to say that you wouldn't be doing much modern 3D gaming at 2880x1800 in this thing regardless of whether or not you have 2GB of vram. The extra resolution is there for displaying your desktop, while for gaming you've got pixel doubling at 1440x900.kristoffe - Saturday, June 16, 2012 - link
you dont game much or do 3d /editing. 2gb+ is the point of post 1080P work. do your homework before posting.tipoo - Tuesday, June 12, 2012 - link
Most games would not run at native retina resolution on the 650m regardless of the video RAM, I don't think 1GB will be the bottleneck. You'd be playing anything more demanding than Diablo at lower resolution.mcnabney - Tuesday, June 12, 2012 - link
I was also amused that averaging 20fps is now considered playable. I guess when you have to release a glowing review a few goalposts must be moved.desta23 - Tuesday, June 12, 2012 - link
I thought the same thing.
Monkeysweat - Tuesday, June 12, 2012 - link
I've read a few reviews on Diablo III where they used lower end GPU's and said that due to the style of the game that 20's were acceptable for gameplay,, obviously if it were a FPS, then I would defintely say Anand is on drugs,, but because I've seen it said in other places, I let it slide.MBP isn't built for heavy duty gaming anyways, meant for professional work mostly.
cknobman - Tuesday, June 12, 2012 - link
I am playing Diablo 3 now and can say that if you're just running around town 20fps is fine.
If you're in a battle it is nowhere near ok.
Anand is being too kind and it gives the impression that he is trying to make the Macbook Pro look better than it really is for gaming.
seanleeforever - Tuesday, June 12, 2012 - link
professional work computer that doesn't have Ethernet port and fire-wire port, okay.
ex2bot - Tuesday, June 12, 2012 - link
Inexpensive adaptors announced (and Ethernet now available IIRC) run Ethernet and / or FireWire through Thunderbolt.lowlymarine - Tuesday, June 12, 2012 - link
"MBP isn't built for heavy duty gaming anyways, meant for professional work mostly."Then why does it use a consumer-grade GPU? Certainly were that the case, a Quadro or FireGL would make more sense, like ThinkPads and Precisions use?
ouzelum - Friday, June 15, 2012 - link
Form factor, for one. The machines with those cards are enormous.Also, the benefit of those cards is only noticed in very specific applications like CAD and serious motion-graphics. I would think this audience constitutes a very small percentage of those interested in purchasing this laptop.
tayb - Tuesday, June 12, 2012 - link
It depends on the type of game and then the game itself. Diablo isn't a first person shooter, it plays perfectly fine in most scenes at 20fps. There just isn't much happening on the screen that needs to be drawn more than 20 times per second.Anand isn't the first person to cite 20 frames as playable on Diablo III.
ltcommanderdata - Tuesday, June 12, 2012 - link
It's 20fps in the heaviest scenes in Diablo III. Meaning most other scenes are higher and the actual average fps you'll experience across the whole Diablo III is more than 20 fps.Jumangi - Tuesday, June 12, 2012 - link
No, it's 20s early in the game. Anybody who has actually played the game through knows the battles become bigger with many more enemies and effects going on. The first Act does not give a proper indication of what's acceptable. Anand shouldn't be saying it's "ok" for something like D III when in the long run it won't be.
tipoo - Tuesday, June 12, 2012 - link
It's not a twitchy shooter, I think casual gamers would be fine with 20 (so long as the dips weren't too big or frequent). Anyways you could run it at a lower res and get a much better frame rate.pcgeek101 - Tuesday, June 12, 2012 - link
Anything less than 30fps is unplayable, and even 30fps is questionable ...
seapeople - Tuesday, June 12, 2012 - link
You've gotta be kidding me. OBVIOUSLY Anand is not saying that you should buy a Retina MBP so you can play Diablo at 2880x1800. It's a curiosity, an academic discussion. "Hehe, Diablo loads at 2880x1800, and I can run around town! Cool!"Would you have preferred if Anand was like "I tried playing Diablo III at 2880x1800 on a 0.7" thick laptop, but it's nowhere near playable in intense battle scenes in the later levels! I am extremely disappointed in Apple! This is an absolute shame! This computer is in no way a replacement for an equivalently priced Crossfire or SLI equipped desktop running dual 1920x1200 monitors. Therefore I say Apple has ultimately failed in their quest to bring premier ultra high resolution gaming to the thin and portable notebook format."
EnerJi - Tuesday, June 12, 2012 - link
I wish we could upvote comments on this site. This comment deserves a bunch of upvotes.steven75 - Wednesday, June 13, 2012 - link
The guy's name gives away his agenda.
Heathmoor - Saturday, June 16, 2012 - link
You know that sound film movies are 24 fps, right?http://en.wikipedia.org/wiki/Motion_picture#Techno...
DKazzed - Tuesday, June 12, 2012 - link
20fps *at the most stressful scene*
geniekid - Tuesday, June 12, 2012 - link
The game becomes more taxing on graphics as you advance through acts and difficulty. Based on anecdotal evidence (noted in the Update section of http://www.anandtech.com/Show/Index/5865?cPage=2&a... ), later acts and difficulties are more taxing.geniekid - Tuesday, June 12, 2012 - link
Come on Anand and company...stop goofing around and start playing more Diablo. We need more D3 benchmarks in later acts and/or difficulties!scifiballer24 - Friday, August 10, 2012 - link
Right... I'm sure you know more than Apple engineers who designed this thing. If they found that 1GB is plenty for great performance, what makes you think you know better?It probably comes down to the same thing it is every time. Apple makes the software and it runs a ton better than what you'd expect just looking at the numbers.
The Von Matrices - Tuesday, June 12, 2012 - link
For those who use a laptop as their primary computing device and dock it when at a desk, you have a dilemma with this laptop. With the integrated display having a higher resolution than any external display, it seems counter intuitive that when you're at a desk you would connect the laptop to a dock and use a lower resolution external display. Are the currently available external displays still better than the integrated high-resolution display in this laptop?B3an - Tuesday, June 12, 2012 - link
Well the MacBook Pro scales everything. Because otherwise 2880x1800 on a 15" screen will make things WAY too small, you could never read the text here for instance without getting extremely close to the display, it's not at all practical. So everything on the MacBook Pro is just scaled larger (giving you less work space), so you'll still get more stuff shown to you on a 30" 2560x1600 desktop display for instance - you'll have more work area.And obviously high-end desktop displays will have things like much higher colour gamut, more accurate colours, higher brightness and so on.
EnzoFX - Tuesday, June 12, 2012 - link
Even if there were, it is besides the point. At most you effectively have a 1920x1200 "res" with the laptop, in a desktop environment. So you have every good reason to connect it to an external display as before.gevorg - Tuesday, June 12, 2012 - link
Probably not, but a 27" or 30" is heck of a lot more productive than 15" display.JeffFlanagan - Tuesday, June 12, 2012 - link
Your "dock" just needs to be a lens in front of the screen to size it up to at least 22"Heathmoor - Saturday, June 16, 2012 - link
Or placing the laptop on a platform 10 inches away of your nose and the keyboard underneath the platform. At least this way you could save in the electricity bill.gorash - Tuesday, June 12, 2012 - link
Is it using the IGZO screen?
tipoo - Tuesday, June 12, 2012 - link
IPS
Death666Angel - Tuesday, June 12, 2012 - link
I'm pretty sure that IGZO doesn't have anything to do with the pixel matrix. IGZO refers to the transistors that drive the panel, not the crystals that form the picture. Right now IGZO was shown with *VA and OLED technology, but I don't see any reason why TN and IPS shouldn't also be used in the future.If someone knows more about this, please correct me. :D
Bob-o - Tuesday, June 12, 2012 - link
> At 1440 x 900 you don't get any increase in desktop resolution compared
> to a standard 15-inch MacBook Pro, but everything is ridiculously crisp.
This word, resolution, I do not think it means what you think it means.
You ARE getting an increase in resolution, that's why it looks crisp. You are NOT getting any increase in desktop SIZE. Your glyphs are the same size (in mm or whatever) but are being RESOLVED at a greater PPI. They are simply faking out applications with an artificial screen size. . .
ImSpartacus - Tuesday, June 12, 2012 - link
I know. I hate that as well, but it's become industry convention to use "resolution" to describe pixel count AND the size of UI elements. I feel your pain.owned66 - Tuesday, June 12, 2012 - link
i think what he meant was that desktop size will still be the same because everything is scaled up when ur at a higher rez in osxfor example im on my hp now and i can fit about 20 icons horizontally if my rez magically became 1080p the icons would looks smaller and thus more icons can fit
desktop size wont change things will only look more crisp
Heathmoor - Saturday, June 16, 2012 - link
I wonder what definition of resolution you are applying. Here you are the definition of image resolution in Wikipedia:- Image resolution is an umbrella term that describes the detail an image holds. The term applies to raster digital images, film images, and other types of images. Higher resolution means more image detail.
http://en.wikipedia.org/wiki/Image_resolution
According to this definition Anand is right. You don't get any more detail by replacing a pixel by four with exactly the same information. For instance, a black screen has null resolution no matter how many pixels and close they are because you can't resolve any line pair in such screen.
A similar criterion applies to:
- Optical resolution, the capability of an optical system to distinguish, find, or record details
- Resolution (mass spectrometry) the ability to distinguish peaks in a mass spectrum
- Angular resolution, the capability of an optical or other sensor to discern small objects
- Spectral resolution, the capability of an optical system to distinguish different frequencies
- Sensor resolution, the smallest change a sensor can detect in the quantity that it is measuring
http://en.wikipedia.org/wiki/Resolution
Impulses - Tuesday, June 12, 2012 - link
Rendering off screen and downscaling is a clever workaround to make the new display more versatile, and the fact that they went IPS is just lovely. Why did they not update any of the Airs though? The price drop is nice but a 1080p (or x1200!) IPS with a $100 price hike would've been just as swell.
How much work is involved in updating 3rd party apps for the new display?
ananduser - Tuesday, June 12, 2012 - link
Clever it might be but it's not resolution independence. They are just workarounds. And Apple cannot expect the entire world to code 2 versions of apps/sites just to appease their niche high-res laptop.Tegeril - Tuesday, June 12, 2012 - link
Judging by the rapidly slipping delivery dates, I don't think the device is niche at all for app developers.tuxRoller - Tuesday, June 12, 2012 - link
Offscreen rendering is something every compositing desktop does.
That's where you perform the Porter-Duff texture compositing.
bhtooefr - Tuesday, June 12, 2012 - link
...almost like Portal 2 is rendering a 2880x1800 image to a 1440x900 surface presented by the HiDPI support, and then HiDPI is scaling back up to 2880x1800.That would be about the only way to explain the pixelation and the unreadable console.
roddyp - Tuesday, June 12, 2012 - link
Indeed. The Portal screengrab showing the console is at 3840 x 2400 resolution, while the diablo ones are at 2880x1800. There must be some upscaling going on...foxyshadis - Tuesday, June 12, 2012 - link
I think he said he was playing at 1920x1200, so that kind of makes sense. That's a little too much mangling to be anything other than a total bug, though, it'd have to be rendered at something like 720x480 to be that bad.desta23 - Tuesday, June 12, 2012 - link
Seems to me that this machine would have been much better off with a 1920x1200 IPS panel that it was actually capable of pushing as opposed to a 2880x1800 panel that the included 650m can barely handle...That and not having the RAM soldered to the circuit board might have made this a decent upgrade for those of us with older MBP's...
As is you get a screen that looks really nice but the computer is too underpowered to do much with...
samemodel - Tuesday, June 12, 2012 - link
Seems it performs excellently!vladi013 - Tuesday, June 12, 2012 - link
I don't see how average user will benefit from increased DPI. I can tell how will they benefit from viewing angles, increased contrast and color accuracy and more screen estate but not from more dense DPI.All this "my resolution is better than yours" is getting out of the hand, kind of like CPUs in Android phones; quad core, hexa core, etc. Completely irrelevant its the software that is either efficient or not.
Also reminds me when Galaxy II made a debut and everyone was praising the screen and I asked myself did they even see the screen? The screen can't reproduce pure white! Everything is tinted in either blue or yellow depending on the viewing angle. Yet the screen got praised all the way to heavens. Why? Because of the resolution.
If this Retina came as standard and not at $500 premium I will be all for it, but as an option it's not worth it.
StormyParis - Tuesday, June 12, 2012 - link
Actually, the reason I love AMOLED screens is 1- the pure blacks, 2- the excellent contrast, and 3- that they don't act as a lamp when reading in bed at night. I don't care much about color fidelity on a phone. Resolution is fine too, but that's secondary.themossie - Tuesday, June 12, 2012 - link
What's the point of higher resolution, especially when you can't use the native resolution? Dot pitch is gone! You won't see individual pixels at a normal distance.This is truly impressive - a laptop screen which approaches the dpi of my phone.
Does it improve usability directly? No.
Does it enable a near-flawless screen experience? Yes.
PC makers will bring you a screen like this without a $500 premium, just wait around a bit...
solipsism - Tuesday, June 12, 2012 - link
"PC makers will bring you a screen like this without a $500 premium, just wait around a bit..."Because vendors have been right on Apple's ass to offer the same or better resolutions in IPS displays after they introduced the iPhone 4 and IPad (3)¡
pcgeek101 - Tuesday, June 12, 2012 - link
It pains me to say this, but PC makers will never get it right. They've had a hell of a lot of time to get it right, but they fall flat on their faces. They constantly release "ultrabooks" with 1366x768 resolutions, which is an absolute, fucking JOKE. What year is this? 2001? I mean, seriously?Death666Angel - Tuesday, June 12, 2012 - link
SGS2 had a mediocre resolution when it debuted. Many smartphones of the time had 960x540 resolution and even 720p phones were announced then. The reason the screen was praised was for its increased sub-pixel resolution compared to previous OLED smartphone devices and for being OLED which a lot of people like.seapeople - Tuesday, June 12, 2012 - link
You realize that $500 premium is actually a $400 premium, and includes a good quality Samsung 256GB solid state drive, right? Not to mention that the screen is IPS with markedly better contrast ratios and viewing angles than virtually anything else out there.terrinecold - Friday, June 15, 2012 - link
The average user will benefit because in most applications (those that don't do special text rendering but take advantage of the OS X APIs to render for them) the text will be rendered much nicer. This is the same as what happened with the iPhone and iPad Retina displays: in most cases, with no work from the developer, the application's text rendering is much crisper. Then with some work on the graphical resources (by including 4x resolution resources with a special naming convention but with no additional line of code) all the graphics look much sharper.
The constant between all the Apple Retina displays is not the marketing behind them, it is the way the APIs handle them. Basically for most high level APIs the resolution is standard but things (lines, text) are much crisper, and when you want you can specifically render things at double resolution using a new set of APIs which were introduced with the iPhone 4 (and which are the same on the Mac).
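As an illustration of the resource-naming idea mentioned above, here is a minimal sketch of how a double-resolution asset might be picked by scale factor. The "@2x" suffix is Apple's documented convention; the helper function itself is hypothetical, since AppKit/UIKit normally performs this lookup for you.

```python
import os

# Sketch only, not an Apple API: ship "image.png" plus a double-resolution
# "image@2x.png" and pick one based on the display's scale factor.
def pick_image(base_path, scale_factor):
    root, ext = os.path.splitext(base_path)
    if scale_factor >= 2:
        candidate = f"{root}@2x{ext}"
        if os.path.exists(candidate):
            return candidate      # high-DPI asset, if the developer shipped one
    return base_path              # otherwise fall back to the 1x asset

print(pick_image("toolbar_icon.png", 2.0))  # prefers toolbar_icon@2x.png when present
```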
Henk Poley - Tuesday, June 12, 2012 - link
I wonder how well Microsoft Windows runs on these retina displays.Spivonious - Tuesday, June 12, 2012 - link
Probably great. Just set the DPI to 220. Of course, some apps won't scale right and will look funky (some text is tiny, some text runs off screen), but most apps are fine. I've been running a 130dpi setup for a few years now and wouldn't go back to 96dpi. It's amazing how different fonts look at higher dpis.pcgeek101 - Tuesday, June 12, 2012 - link
I would love to run Windows 7 or 8 on a display like this, but I don't want the Apple keyboard or proprietary SSD (and other hardware). I wish PC makers would get with the program.laytoncy - Tuesday, June 12, 2012 - link
The display is great. The machine looks great. Finally 16 GB's of RAM. All SSD's for storage. Check, Check, Check...but why only a 650M? I'm guessing they just can't cool a 680M or even a 660M? I just don't get it. They don't seem to cut too many corners but they never give you options (at least good ones for gaming) when it comes to graphics...Zensen - Tuesday, June 12, 2012 - link
I hope more manufacturers start offering ssd and the 16gb ram is nice. I do think the display is overkill on a portable unit but whatever, I'm sure it'll many people fine.As for the graphics, sounds about right... it'll play angry birds, ninja fruit just fine.
tipoo - Tuesday, June 12, 2012 - link
Probably heat or cost or both. The 650 is a good improvement from the 6770 though. It's not made to be a gaming machine although it can clearly handle a decent level of gaming, so for most the 650 is fine.desta23 - Tuesday, June 12, 2012 - link
It's running Diablo 3 at 27fps in that picture -- one of the least demanding new titles ever released (exaggerating a bit here) -- hardly a "decent level of gaming" considering it isn't even rendering any monsters...Apple's engineers clearly never heard the story of Goldilocks and the three bears and finding what was "just right" in regards to screen resolution....
This upgrade amounts to paying $500 for a screen which the hardware is too slow to run and then losing the ability to upgrade your own RAM and HD...
I own a Macbook Pro now and would love (and can easily afford) a new one -- I'll pass...
thoughtsforthemind - Tuesday, June 12, 2012 - link
Did you read the article? The laptop is running Diablo at native resolution at 27 fps, or in other words, driving more pixels than any other laptop. Ever. Made. (that I'm aware of, anyone know better?)With the ability to turn down the resolution to a more normal 1920x1200 or even 1280x800 for more demanding games, and minimal artifacting going by other programs, it should be fine as a gaming unit. Not top of the line, but medium/high settings for 2012 games fine.
Now if they would release more games for Mac...
desta23 - Tuesday, June 12, 2012 - link
Did you read the article and actually play Diablo 3? If he's getting 27 fps doing nothing -- what is he going to get in Act 3 Siegebreaker with 40 monsters on the screen? So no -- it won't play Diablo 3 at that resolution...So your suggestion is turn down the resolution to make it work? Considering you'd have to run 1440x900 to avoid major pixel interpolation problems what was the point of spending all the extra money for that screen again?
The point is - Apple overshot. The hardware in the computer isn't up to snuff for 2880x1800. Most people running that resolution would have an SLI or Crossfire setup. The notebook should have came with a 1920x1200 IPS LCD panel - period.
KitsuneKnight - Tuesday, June 12, 2012 - link
You're making the mistake of assuming that Apple is targeting gamers with this system. Apple isn't (when do they ever?). At best, being able to game is a nice 'perk' (I've played all of Act 3- and most parts of other acts- on a system that would only pull low-20 fps when idle in Act 1... it was perfectly playable).If you're doing anything that involves text, using a system with a very high DPI will make it much easier to use for extended periods of time. In addition, if you did iOS development, you'd actually be able to do development for the iPad 3 without having to scale the crap out of the simulator.
Having very high DPI will also be wonderful for anyone that works in Photoshop (or similar programs). I don't know exactly how Lion's scaling will impact Photoshop, but I'm sure both Adobe and Apple will be working to make sure it works well.
There might also be more controls/functionality included with Mountain Lion. I wouldn't be surprised if Lion just has a very limited version of ML control/APIs.
ytpete - Tuesday, June 12, 2012 - link
>> You're making the mistake of assuming that Apple is targeting gamers with this system. Apple isn't
Then why did they show Diablo III during the keynote?
Also, I'm not at all convinced that high-DPI is automatically great for Photoshop. E.g. if you're making content for the web, you don't want to be viewing pictures at a totally different DPI from everyone else.
seapeople - Tuesday, June 12, 2012 - link
The screen is IPS. That in itself is worth a significant upgrade price. Maybe not $400, but with that you get a solid state drive as well.Forget the resolution upgrade, many people would consider this a worthwhile upgrade for the IPS screen and SSD alone.
Also, I wonder how 1920x1200 will look on this screen - the pixels are so close together that it might not be bad at all.
Conner_36 - Tuesday, June 12, 2012 - link
One hundred 650m's are cheaper than fifty 650's, twenty-five 660's and twenty-five 680's.
NCM - Wednesday, June 13, 2012 - link
Why don't they give you good gaming options?
Because it's called a MacBook Pro, not a MacBook Gamer.
Owls - Wednesday, June 13, 2012 - link
If it's a "PRO" why not put a GPU that is worthy of that name? Instead they stick an awful 650 in there and price it at $2200. Genius!_Marco - Tuesday, June 12, 2012 - link
Anand, did you maybe already have a chance to connect the retina MacBook Pro with a non-retina external display? Is that combination usable?Heathmoor - Saturday, June 16, 2012 - link
How many external Retina displays have you heard of?
Gunbuster - Tuesday, June 12, 2012 - link
So if I'm reading this correctly they give you a 2880x1800 native resolution screen but then don't allow you to actually display the desktop at that resolution?
Let's sell a car advertised as being able to reach 100mph but really electronically limit it to 50mph and then double the readout on the speedometer...
I wonder if the bootcamp driver imposes the same limits on windows?
Spivonious - Tuesday, June 12, 2012 - link
You need to read up on DPI. The computer wouldn't be useable at 2880x1800, 96dpi. You'd need a magnifying glass to read text.Gunbuster - Tuesday, June 12, 2012 - link
No I don't need to read up. Apple is saying no matter what my use case is that I cant have access to 2880x1800 native even if I don't care about their text size. What if I want to do medical imaging or photo editing...Sounds to me like they have an OS that cant scale text so they decided to do this pixel doubling BS.
Synaesthesia - Tuesday, June 12, 2012 - link
Of course you can look at medical images and edit photos while displaying at high resolution.The nice thing is you get decent sized UI elements and a 2880x1800 screen.
The scaling only applies to text and UI elements, NOT to images which you are looking at or editing.
enderwiggin21 - Tuesday, June 12, 2012 - link
Thanks blerg. That is an important distinction that could be spelled out better.gorash - Wednesday, June 13, 2012 - link
Umm, the images are also clearly scaling.
ramb0 - Monday, June 25, 2012 - link
Apps that are tweaked so they are "Retina aware" will display their video/images at full native resolution, while the text and UI elements are scaled so they are still readable.
Apps like iMovie, Aperture, FCPX and Photoshop CS6 (free update is on its way) will all support the native "Retina" resolution for the video/image you are editing.
For apps that have not been tweaked for "Retina aware", OS X will assume all aspects of the app need to be scaled so it is readable at the 1800 resolution.
Heathmoor - Saturday, June 16, 2012 - link
The example you've given is actually real, many car models have their max speed electronically limited (e.g., BMWs, Mercedes-Benzes, Audis, even Teslas). Anyway, it's likely Apple intends to offer native resolution for the OS desktop in a future OS iteration.
jonyah - Tuesday, June 12, 2012 - link
The closest competitor to this would be the Vaio Z. I'd like to see it compared to its full HD screen. Comparing it to some ultrabooks isn't exactly the best comparison. The Z, though not considered an ultrabook, is lighter than this and just as, if not more, powerful.
FiReBReTHa - Tuesday, June 12, 2012 - link
Same thing I said a couple pages back!instead they are comparing 1368x768px "Cell phone" screens vs the 2880 beast.
I would like to see the Z ranked in the comparison!
BSMonitor - Tuesday, June 12, 2012 - link
It's not remotely reasonable to expect such high resolution gaming on an ultra-thin device. Not when we barely get these resolutions in a LOT of games in DUAL GTX680 mode on the desktop.desta23 - Tuesday, June 12, 2012 - link
Which is exactly why this "retina" display is such a joke in this notebook -- it doesn't have the power to run today's applications let alone tomorrow's at that resolution...Does anyone have the belief 1920x1200 is a low resolution in a 15" screen? Does anyone really believe the current Macbook PRO was too thick? Why couldn't Apple have used the additional space for a bigger battery and better cooling allowing better graphics on their "PRO" model?
Silenus - Tuesday, June 12, 2012 - link
No...the joke is actually that so many are overemphasizing "gaming" on a MacBook Pro. Gaming is not even remotely the target market for these notebooks. A 650M is PLENTY to run any non-gaming applications now and for years to come. ANY gaming performance it does have is purely incidental to its main purpose.
There are plenty of gaming notebooks out there folks. MacBook Pros are not and never were gaming notebooks. Please folks, season your expectations with some reason.
Owls - Wednesday, June 13, 2012 - link
So why did apple show D3 during the keynote? Or is this blind fanboyism talking?
Heathmoor - Saturday, June 16, 2012 - link
The more versatile the computer, the greater market niche you can reach. Larger sales means lower costs per unit and larger profits. High end laptops won't disappear anytime soon, desktops used as domestic computers will, being replaced by laptops like this. The more powerful you make them the more people will ditch their current desktops and replace them by a top notch laptop, but you have to sell it in the first place.Death666Angel - Tuesday, June 12, 2012 - link
Last I checked Apple didn't cater to the gamer crowd with their notebooks, and Office and Photoshop should run fine enough on that resolution.
desta23 - Tuesday, June 12, 2012 - link
Last I checked it's called the Macbook "Pro" -- Professionals might have need for graphics horsepower in situations other than games. Any "pro" is going to want his machine running at the native resolution...Quite frankly - 2880x1800 on a 15" notebook with these specs is a joke.
rs2 - Tuesday, June 12, 2012 - link
There was a question posted here regarding the performance of the glossy display relative to a matte one. It has been removed, along with a number of comments pointing out that as a picture included in the article showed, glare was indeed an issue with this display, at least in contrast to what could be expected from a matte display. The picture which so perfectly highlighted the issue with glare on the glossy display also appears to have been removed and replaced with one that shows far less glare.Is anandtech in Apple's pocket now, or something? Are we not allowed to point out the issues with glossy displays and glare? Are images which highlight these issues being taken down at the request of Apple, and replaced with subjective "well the glare really doesn't bother me" glad-handing?
If you've got any journalistic integrity at all you'll restore the previous picture at least as a counterpoint to the discussion on glare, and explain why comments pointing out that a glossy display will produce more glare appear to have been removed.
rs2 - Tuesday, June 12, 2012 - link
Criticism retracted, the photo and comments are actually on a different but deceptively similar article: http://www.anandtech.com/show/5996/how-the-retina-.../facepalm
Heathmoor - Saturday, June 16, 2012 - link
Anyway, people would eventually discover that awful glare as soon as they went to an Apple store to have a look at the laptop. You can't hide certain things.meatyocre - Tuesday, June 12, 2012 - link
I'm so torn....sell my Mac Pro 1,1 & 13" MBP and streamline workflow to a single machine Retina MBP...It is looking pretty sweet.brentth - Tuesday, June 12, 2012 - link
Anand,
Thanks for the late night on the MBPR. I appreciate it.
UltraWide - Tuesday, June 12, 2012 - link
Interesting implementation of the 2880x1800 panel. I was expecting it to be full 2880x1800 resolution for the desktop with icons and apps scaling to this while keeping the quadrupled usable surface area. This implementation gives you more nuance since you have 4x the rendering capacity, but it doesn't do anything for increasing the usable desktop area.I am guessing there will be an OSX update in the near future once the ecosystem matures more? or not?
Heathmoor - Saturday, June 16, 2012 - link
I guess you're guessing right.
JeffFlanagan - Tuesday, June 12, 2012 - link
I'd instantly buy a Windows 7 laptop with these specs, and might even jump to a Mac if my work wasn't Windows-specific.fmcjw - Tuesday, June 12, 2012 - link
Well, this makes the VAIO S15 look like a bargain. For $1000 you get Ivy Bridge, matte FullHD IPS, mSATA SSD capability, under 2kg weight, and standardized RAM/HDD/Optical bay for 2nd HDD.Seems people are buying it for the 2880x1800 panel and running it at 1920x1200 anyway. Then there's the issue of the humongous 96WH battery, and the charging times and power brick size issues for this "field workstation."
For me, the MBA13 would be perfect if it came with a matte screen, mSATA SSD, and 4GB onboard+1 open DIMM slot. Apparently that's "not the way Apple does things". They think glossy is best, end of story. By design they discourage users to buy memory and storage from reputable brands such as Kingston, OCZ, etc. Their iOS products do not take microSD cards even though it's the most obvious benefit from a user's POV. Apple is a highly profitable company by their consistent and persistent greed.
I think Anand is quite short-sighted to lust for the latest and the "best". If you work for the sake of diversity and technology, you would be honest about your needs and be able to strike a balance of good performance, cost, and supporting open industry standards, without resorting to elitist, locked down black-boxes of beauty. If you need a retina display for your work, which I assume is writing this blog, then you are saying you are too good for AMD, Ubuntu, and the rest of the alternatives/underdogs. You are not doing anything for the future of the industry, but merely a slave to vanity and laziness.
Constructor - Friday, June 15, 2012 - link
I think Anand got the scaling mechanism very wrong.
When you set it to the scaling equivalent to 1920*1200, it still renders directly to the actual 2880*1800 resolution (if the program hasn't screwed up).
Mac OS X will just provide a different UI scale factor to the Quartz framework for this display than the default 2 (1440 -> 2880), namely 1.5 (1920 -> 2880).
But the scaling appears to happen *before* rendering the actual pixel image, not *afterwards*. Just look at the Retina screen in that scaling mode zoomed up with CTRL-Scroll. No pixel scaling artefacts, but there would be such artefacts if Anand's theory actually held true.
It seems the Application "sees" 1920*1200 "Points" as its *logical* resolution but when it tells Mac OS X to actually draw text (for instance), Quartz will actually scale it to the real resolution *before* starting to render pixels.
Anand's theory would amount to a massive waste of CPU and GPU power, massive waste of RAM (oversized intermediate pixel buffers) and compromised output quality.
But apparently output is always fast and perfectly crisp regardless of how you set the scaling slider.
Only when you run into misbehaving apps which make incorrect assumptions about the physical display resolution (ignoring the scale factor provided by the API), you'll see pixelated output, and in *that* case the medium setting provides the least annoying pixelation. Properly behaving apps seem to scale perfectly in all settings, however, as far as I've seen on the demo machine I've been examining.
Anand, could you please re-examine this?
I'm not *that* deep into the APIs, but the (somewhat) recent change to floating-point "Point" coordinates at the introduction of the iPhone 4 Retina display (instead of integer pixels before) and the "Scale" factor for each display is pretty clear about how this is done.
joevt - Friday, June 15, 2012 - link
Open the screen shot .png file in Preview and view it at Actual Size. The screen shot is 3840 × 2400 pixels. That must be the size of the screen buffer. That is then scaled down to 2880 x 1800.
It's just like what would happen if you were to set a normal laptop that supports 1440 x 900 to a resolution like 1024 x 640. The screen buffer is 1024 x 640 and is scaled to 1440 x 900.
3840 × 2400 isn't too unreasonable. It's only 36 MB of the 1 GB VRAM. Graphics cards can support 4k x 4k textures which is much larger.
You don't see any pixel artifacts because the screen shot does not show what it would look like at 2880 x1800.
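To make the sizes concrete, here is a toy model of the pipeline described in this sub-thread: apps work in logical points, a per-display scale factor maps points to a backing store, and in the scaled settings that backing store is resampled to the 2880x1800 panel. This is illustrative arithmetic, not Quartz code.

```python
# Toy model of the point/pixel arithmetic being discussed here.
# Apps work in logical "points"; a scale factor maps points to backing-store
# pixels; in the scaled settings that backing store is then resampled down
# to the physical 2880x1800 panel.
PANEL = (2880, 1800)

def backing_store(logical_points, scale=2.0):
    w, h = logical_points
    return (int(w * scale), int(h * scale))

for looks_like in [(1440, 900), (1680, 1050), (1920, 1200)]:
    store = backing_store(looks_like, scale=2.0)
    downscale = store[0] / PANEL[0]
    print(f"looks like {looks_like} -> renders {store} -> {downscale:.2f}x downsample to {PANEL}")
# (1440, 900)  -> (2880, 1800) -> 1.00x (native, no resample)
# (1920, 1200) -> (3840, 2400) -> 1.33x downsample, matching the 3840x2400 screenshots
```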
Constructor - Sunday, June 17, 2012 - link
The full-size screen shots are indeed a plausible argument.I've been used to the pixel zoom (CTRL-scroll) zooming into the *physical* pixel map, but on re-examination it seems that it zooms into the *virtual* pixel map on the Retina Display instead, so the lack of any visible scaling artefacts there is indeed *not* an argument against pixel scaling in the non-native scale setting.
With the limited array of software on the demo machine I couldn't verify this exactly even with an optical loupe, but the difficulty to discern any artefacts even under the loupe in all of the available scale settings basically removes quality degradation from being a practical concern.
So at this point I retract my earlier assertion that pixel scaling was *absent*. It seems that in the non-native scale settings it is being used to handle the scaled modes, but the main mechanism is still the logical Points coordinate system being independent of the physical resolution – that is why most well-behaved apps should not need any adaptation (apart from improved icons).
pcgeek101 - Tuesday, June 12, 2012 - link
I'm with you, Jeff. I'd literally buy a PC version of this today, if it were available. My work is tied to the Microsoft platform though, so this wouldn't really be a realistic purchase for me, especially at such a high price.repoman27 - Tuesday, June 12, 2012 - link
It's a PC that happens to be made by Apple and comes with Mac OS X pre-installed, but there's nothing stopping you from just running Windows on Apple hardware these days.sonelone - Tuesday, June 12, 2012 - link
Could you also do some of the gaming benchmarks with boot camp windows 7? Battlefield 3 doesn't work on osx.eanazag - Tuesday, June 12, 2012 - link
First of all the macbook of any type is not a gaming rig. I don't think those who have been criticizing Anand really even know what they're saying or have really read anything on this site, nor play many Blizzard games. 20 fps at 2880x1800 is pretty good for any machine. It is anecdotal and posted in the Pipeline. I am impressed a 650m is even pushing any game at 2880x1800.I have the first two Diablos and it goes without saying that as you get to the end of the game it starts to throw crazy numbers of foes at you and therefore more taxing on the system. Now I also have SCII and it runs at ~25 fps on my desktop and I never thought anything of it. Blizzard doesn't program they way most game developers do. They often don't have high requirements for hardware and they typically make 3rd person games.
I can understand the slider approach when not all applications have been updated to support the screen. Truth be told Macs are sold to old and young people and that native resolution is not going to fly all the time on a 15' screen. I would guess 90% of Mac users are not even going to change the res from 1440x900. They're gonna talk about how good the screen looks.
Disclaimer - No fanboy affiliation. If I lean towards any hardware, it would be AMD. Unfortunately they break my heart more often than I can tolerate and I buy according to what makes sense for me. Ok, I am an SSD fanboy - they have made all my machines so much more usable.
desta23 - Tuesday, June 12, 2012 - link
It's called the Macbook "PRO" --Professionals might actually run things like CAD or other graphically intense applications other than games. They might even want to run them at their native resolutions....games are simply a good way to stress the capabilities of a system. Usually if it plays games well - it does everything else great...
How exactly does this higher resolution screen benefit professionals again - by sapping more of their profits with its higher cost?
tim851 - Tuesday, June 12, 2012 - link
What does a CAD application have to do with desktop resolution?And who is forcing professionals to buy Apple? It's not really a secret that Apple has been far from being a bargain vendor.
This website is insane.
Every. Single. Laptop review gets spammed with "need more resolution"-comments and now that somebody does extremely high DPI, people have to be negative about it.
I guess the web made it cool to be "anti" everything.
seapeople - Tuesday, June 12, 2012 - link
I'd also like to add a couple more ubiquitous complaints:Display reviews: "TN panel - DON'T CARE"
Laptop reviews: "Base model includes 5400 RPM hard drive - DON'T CARE"
Laptop reviews: "Still too thick with ancient optical drive tech - DON'T CARE"
And so Apple offers an IPS panel, high performance SSD, and ridiculously thin profile for corresponding specs for a relatively cheap price, and people complain that the display resolution is too high! Meanwhile, you can just run the stupid thing at 1440x900 and still have all the other benefits... and yet we get 90% complaints.
XonicEQ - Tuesday, June 12, 2012 - link
In the diablo screen shot, upper left corner says FPS 27. With nothing but the player character and a merc on screen it's only 27 frames per second. It's barely playable at this frame rate, with mobs on screen it'll be a slide show.
FiReBReTHa - Tuesday, June 12, 2012 - link
I was expecting the Sony Z to get thrown in the mix to compare up figuring this would be high end display testing.MobiusStrip - Tuesday, June 12, 2012 - link
"Apple chose to deal with the issues of glare and reflections by integrating an extremely bright panel with great black levels."In other words, they chose to continue peddling shitty, glossy screens, taking cues from the third-tier schlock being sold at Best Buy.
"Great black levels" don't mean jack when the screen is covered by sheen of reflection in all lighting conditions. Even Apple's own marketing shots show the "deep blacks" being washed out by the asinine gloss on all of their products. Are these the "great black levels" we're supposed to be all jizzed about:
http://images.apple.com/macbook-pro/features/image...
optics261 - Friday, June 15, 2012 - link
anti glare is achieved actually by adding a layer of film to the display. every display that has this layer shows white through a 'filter' and results in 'sparkles' rather than true white. Taking this display and throwing some anti glare goo on there would be like taking a ferrari and getting some home depot paint to change the color.for the best experience display should be in a fairly dim area away from light sources which cause reflections.
when that cant be the case, you can apply the anti glare film, but then you are basically never going to have a clear picture again.
or just go ahead and apply the crappy stuff yourself rather than at the factory.
http://www.amazon.com/Anti-glare-for-Newest-Macboo...
pcgeek101 - Tuesday, June 12, 2012 - link
This is just maddening .... what the hell is wrong with PC laptop manufacturers? Given that PC laptops are not restricted to the control of Apple, there should have had MASSIVE display resolutions YEARS ago, before Apple came out with this.I'm seriously tempted to switch to Apple just because of this, but reading about all the "restrictions" is really irritating, too. Why can some applications not "see" the 2880x1800 resolution? Why doesn't Apple use standard 2.5" SSDs?
There will never be a "best of both worlds," will there?
felipecn - Tuesday, June 12, 2012 - link
2.5" is too much for SSDs. I really like the "SSD stick" that Apple introduced with MacBook Air and would be really nice if it was a standard.And "Retina aware" apps can use the full resolution. Final Cut (or is it iMovie) can show a full 1080p stream on preview mode, something it couldn't if was restricted to a "virtual 1440x900 mode".
The OS just doesn't render everything at 2880x1800 because the UI would be really small.
tipoo - Tuesday, June 12, 2012 - link
Standard SSDs would take more space, plus they can charge more for upgrades if you can't get one (yet) yourself.Windows 7 doesn't scale with DPI very well, Windows 8 will, maybe once that hits we'll start to see similar things on Windows PCs.
foxyshadis - Tuesday, June 12, 2012 - link
Games have always had a bad habit of coding for specific resolutions instead of adapting for any available resolution, so it's no surprise that they're the worst offenders. Changing aspect ratios is even worse. It seems like even AAA games are made by companies who all have identical monitors.Sfasciacarene - Tuesday, June 12, 2012 - link
A question: is 1440*900 supported? Because it is exactly half of 2880*1800 in each dimension, so 1 logical pixel maps to 4 physical pixels; if you have a problem with a game you can drop to that resolution without the usual problems of running a display at a non-native resolution.
Another question: is 1680*1050 etc. a scaling setting like Windows DPI (100-125-150%), or a real change of resolution? If it isn't, how does the display work at those resolutions, and are they usable?
Because if I use Windows or games (I understand this isn't a PC for gaming, but sometimes I like to use it, and I think OS X + Windows can give a better overall experience), this becomes more important, especially 1440*900 non-Retina.
foxyshadis - Tuesday, June 12, 2012 - link
The game can support whatever resolution it wants, and then bilinear or bicubic resampling gets it to the panel's size. 99% of games will go with native resizing support - and aside from text, it'll look better, because fuzzies look better than jaggies - and retro-style games will already be using their own engines to make the screen look like it should.
Sfasciacarene - Wednesday, June 13, 2012 - link
Yes, but I'm asking what those resolutions look like. Try using 1280*800 on a MacBook with a 1440*900 native panel and everything looks wrong, because 1 logical pixel is represented by 1.x physical pixels. A 2880*1800 panel gives you the possibility of using 1440*900 without a problem, because exactly 4 physical pixels become 1 logical pixel, so images and text stay in proper focus, just like on a display with a 1440*900 native resolution.
The other question is what the OTHER resolutions (the non-scaled ones) look like.
Sorry for my bad English.
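For anyone wondering why the exact-half case is special, here's a quick illustrative check (just arithmetic, not anything Apple ships, and it ignores the fact that OS X's scaled modes actually render at 2x and then downsample, as discussed further down): a mode only avoids resampling blur when the ratio of physical to logical pixels is an integer.

# Ratio of physical panel pixels to logical pixels for a few modes on the 2880x1800 panel.
panel = (2880, 1800)
for mode in [(1440, 900), (1680, 1050), (1920, 1200), (1280, 800)]:
    ratio = panel[0] / mode[0]   # same as panel[1] / mode[1] for these 16:10 modes
    verdict = "exact integer mapping" if ratio.is_integer() else "needs resampling"
    print(mode, "ratio %.3f:" % ratio, verdict)
# Only 1440x900 gives a clean 2.0; 1680x1050 is ~1.714, 1920x1200 is 1.5, 1280x800 is 2.25.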
commooncents - Tuesday, June 12, 2012 - link
Really? Buying $2500 laptops to play $80 video games? Sorry, but I'm growing truly tired of reading comment after comment about expensive hardware and its gaming capabilities. It's not just this site. People actually do more important things with computers, believe it or not, than playing games:
audio editing
video editing
photography editing
computational fluid dynamics
CAD - computer aided design
web design
software engineering
live conferencing
etc etc etc
If you want to play games... buy a PlayStation. You'll save yourself about 2200 bucks and have a much better experience. Thank you, stay in school, and say no to drugs.
kensama - Tuesday, June 12, 2012 - link
There is no way you're playing games on a GT 650M at that resolution. There is no reason to buy this model just for gaming purposes.
xp3nd4bl3 - Tuesday, June 12, 2012 - link
I don't understand all the bashing about an underpowered GPU or an overly expensive display.
Game in Windows at 1440x900 and you *shouldn't* have scaling artifacts, and you've got an experience equivalent to an Alienware M14x. Is that a "joke" of a gaming machine? No, it's pretty legit.
Do anything non-gaming and take full advantage of the pixels - images AND text will look spectacular.
All in an enclosure that is lightyears ahead of the competition. It's not cheap, but you get what you pay for.
"If you have the means, I highly recommend picking one up"
RedWingBlade - Tuesday, June 12, 2012 - link
Hi Anand,
Could you compare the new MBP display with the existing DreamColor and PremierColor displays offered on HP's and Dell's workstation lines, respectively?
Thanks.
sonelone - Tuesday, June 12, 2012 - link
I just have a few requests for the full review:
1. Heat and noise
2. Bootcamp gaming and application performance
3. Speaker Quality
wfolta - Saturday, June 16, 2012 - link
It runs considerably cooler than my 2010 MBP 17". As an example, I fired up Diablo III at 2880x1800 with most settings at High, plus AA on. I'm only in Normal Act III, but I was able to play it comfortably, with it dropping to 20 FPS when there were a couple dozen monsters onscreen. From what I understand, dropping the resolution would increase the frame rate, even though it's still ultimately a 2880x1800 display.
This was fairly nice, but what actually impressed me was that I ran this test with the machine on my lap and it did not get uncomfortable. My 17" MBP can be uncomfortably hot under a much lighter load.
And, the fan noise is much more of a white noise, rather than the typical turbine-spinning-up sounds from laptop fans. I had to put my ear close to the screen to make sure that's what it was.
It's a very impressive machine that seems to run cooler and quieter than its larger predecessors, yet have more CPU and GPU power.
mmoy - Tuesday, June 12, 2012 - link
I have a co-worker who recently bought a Vaio Z and now wants one of these. I have another co-worker who wants to know what the heat output is like while playing games. My personal question: how does the console window look with really small fonts?
tipoo - Tuesday, June 12, 2012 - link
I'd like to know how that much-hyped new cooling system works too.
wfolta - Saturday, June 16, 2012 - link
In my brief experience, the cooling plus the new generation of processors is amazing. I can keep it on my lap, even when gaming, and it doesn't get as hot as my 2010 MBP 17" does. The asymmetrical fans make a big difference as well: the noise is more of a white noise, noticeably subtler than older fans.
Another factor that's new to the Mac (at least since my 2010 model) is that it switches between integrated Intel and discrete GPU graphics on the fly, which probably also helps keep things cooler than if you had to choose 100% GPU.
A quick test: I ran a process in R that pegged all 8 "cores". After about 30 seconds the CPU temperature was up to around 200 F, and by about 45 seconds the fans had spun up to about 3,000 RPM and were definitely audible, though still more of a white noise. I stopped the process at that point (the fans peaked at around 4,000 RPM), and 40 seconds later the temperature was back down to about 140 F. All of this was with the machine on my lap; it got warm, but not uncomfortable.
AnTech - Tuesday, June 12, 2012 - link
No matte display, no purchase. At least as an option, even if more expensive. Sign the petition at MacMatte (matte petition): http://macmatte.wordpress.com
foxyshadis - Tuesday, June 12, 2012 - link
Correct me if I'm wrong, but that "resolution" setting that you keep talking about isn't actually physical resolution at all, is it? It's basically equivalent to Windows 7's font size control, but with more predefined options, and it starts out at a good medium for everyone instead of at the smallest possible size. It doesn't appear to label them with numbers anywhere, so did you just best-guess your way to the numbers? It sounds like no matter what setting you select, the panel will always run at its maximum resolution; it just changes the size of font and UI element rendering. It also makes absolutely zero sense for Apple to render anything at double resolution as well - unless they're doing FSAA, which would be dumb for text, so that's a silly thing to speculate about.
It seems either disingenuous or dumb to criticize Apple for not "making the resolution available" when you don't even understand what the feature is. No one can actually read text that's 10 pixels high when you have 200 ppi, so there's no point in offering that setting.
JasperJanssen - Tuesday, June 12, 2012 - link
No, this is not a font size setting, this is the resolution slider. It's in the same place the resolution slider used to be on a non-HiDPI screen, and it says right on the left side of the fricking screenshots what the resolution is.
It seems really dumb or disingenuous to criticize Anand when you clearly don't have a clue what you're talking about.
Constructor - Friday, June 15, 2012 - link
I've played with a demo machine quite extensively and I'm pretty sure the slider does *not* set an actual "resolution".
It just adjusts the purely virtual base coordinate system to one of several possible settings.
The actual drawing is *not* done at those virtual "Point" resolutions, however, but at the *physical* resolution the virtual "Points" coordinate system gets mapped to. That is, by well-behaved apps which use the provided "Scale" factor properly. Apps which ignore the scaling mechanism (which has been there for a while) and still insist on generating pixel images themselves will produce pixelated output.
So you can set the equivalent of 1920*1200 from the application's point of view, but the actual rendering still happens at 2880*1800, just scaled differently (but mathematically *before* actual pixel rendering, not afterwards!).
So the scaling slider only adjusts how big the menu bar, other UI elements and window contents are being drawn, but not the rendering quality, which always remains at maximum (again with well-behaved apps).
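To make the mechanism Constructor describes a bit more concrete, here's a rough sketch in Python (purely illustrative; the names and the function are my own, not Apple's API): window geometry lives in a virtual point coordinate system, and a well-behaved app draws through a scale factor that maps those points onto the panel's 2880x1800 physical pixels.

# Illustrative sketch of the point-to-pixel mapping for the "looks like 1920x1200" setting.
panel_w, panel_h = 2880, 1800        # physical pixels of the Retina panel
virtual_w, virtual_h = 1920, 1200    # virtual "point" coordinate system chosen by the slider
scale = panel_w / virtual_w          # 1.5 here; 2.0 in the default 1440x900 setting

def points_to_pixels(x_pt, y_pt):
    # Text and vector UI drawn through the scale factor is rasterized at the physical
    # resolution, so it stays sharp; a legacy app handing over a ready-made bitmap gets upscaled.
    return x_pt * scale, y_pt * scale

print(points_to_pixels(100, 100))    # (150.0, 150.0)

The key point is that the scale is applied before rasterization for well-behaved apps, which is why output quality doesn't drop at the scaled settings.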
adfrost - Tuesday, June 12, 2012 - link
I've got the 15" 1680 x 1050 Hi-Res Matte screen and my first concern was going from that to the 1440 x 900 equivalent scaling. Glad to see Apple has the option to run at a 1680 x 1050 equivalent if I so choose. I haven't decided if I'll get it or not, but I'm definitely going to check one out in person.agent2099 - Tuesday, June 12, 2012 - link
The simple fact is, if 90% of the things we do on a laptop cannot use a Retina display's full resolution, Apple should have made this laptop 1920 x 1200 native resolution. What most people do on a laptop is read and write text. I would rather do that at the native resolution than at a downsampled resolution.
metatron76 - Wednesday, June 13, 2012 - link
@"The simple fact is if 90% of the things we do on the laptop cannot use a retina displays full Resolution"
You seem to misunderstand one of the following terms (resolution | scaling | pixel density). I'm not sure which of these is tripping you up. I'm also not sure what you mean by "things we do on the laptop". If you are referring to correspondence (e-mail, social web, chat), productivity (word processing, spreadsheets, form-based data), and recreation/leisure (web browsing, gaming, streaming media), then 100% of your apps can take advantage of the Retina display.
Text, graphics, and window elements will be the same size as on your old MBP, but at double the resolution. Applications without high-resolution resources get pixel-doubled until an update is available.
curiousjosh - Monday, June 18, 2012 - link
It seems the misunderstanding may be yours.
He wants to work at the 1920x1200 setting (an option offered by Apple).
At that setting there's no 1:1 correspondence with the 2880x1800 panel, and everything will receive some resampling unless it's an element that can be drawn using Apple's new Retina code.
Even with the new screen rendering options, you have to buy all-new applications to get versions that work with Retina (if they even release this year; even Adobe is going to take a while).
So for now, 90% of things can't use the Retina advantage.
And things like web graphics will always need to be resampled when displayed on a Retina screen if you're in any mode other than 1440x900.
It's a clusterf--k.
icyarthur - Wednesday, June 13, 2012 - link
I totally agree with the dream setup you mentioned (lower-end 15" Retina plus 512GB flash). Apple's decision to leave out the storage upgrade option really disappointed me. I know there are some workarounds, like hooking up a Thunderbolt external HDD for file storage. However, that means I have to carry it around and hook it up, which is extremely inconvenient. After all, there's no point in getting this MBP over the original 15" if I have to lug around a drive with it, right?
metatron76 - Wednesday, June 13, 2012 - link
Anand,
It looks like you're using AA when running at 2880x1800 (you have AA enabled in the screenshot).
When running at the native resolution, there isn't any aliasing to contend with, so you don't need it. Disabling AA will likely give you back 15-30% of your performance.
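For a rough sense of why resolution matters so much for frame rate here, some simple pixel-count arithmetic (not a benchmark, just the raw number of pixels the GPU has to fill per frame):

# Pixels per frame at the settings discussed in this thread.
for w, h in [(2880, 1800), (1920, 1200), (1440, 900)]:
    print("%dx%d = %s pixels" % (w, h, format(w * h, ",")))
# 2880x1800 is 5,184,000 pixels - exactly 4x the 1,296,000 of 1440x900 - before any AA cost on top.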
GrayParrot - Wednesday, June 13, 2012 - link
Can the Retina display be reduced to 1440x900 in System Preferences to lower the strain on the 650M in 3D programs like Maya, modo, and Cinema 4D? I'm fine with working at the lower resolution while in those programs, then resetting to Retina for everything else. But the mention of a performance hit scares me, as 3D programs already tax the graphical capabilities of mobile GPUs at lower resolutions.
akshay_sug - Wednesday, June 13, 2012 - link
First update to the MacBook within 48 hrs - TrackPad 1.0... read more at my blog:
http://abandndtombstone.blogspot.com/2012/06/apple...
hasseb64 - Wednesday, June 13, 2012 - link
Will buy a "Ivy Bridge" laptop for sure. This MAC is a very good option I'll think.Have everything I wished for, a size 14 would have been perfect!
Currently using an old Sony Z 13" and don't see anyone in PC biz that can compete with Apple atm, poor SONY that neglects software updates, fills computers with BLOATware and uses stupid RAID SDD soloutions in new Z series.
XZerg - Wednesday, June 13, 2012 - link
bawawahahahahahaah. iCrap strikes again.
SoundChaos - Wednesday, June 13, 2012 - link
I really don't see where any of these negative comments are coming from. Not that I'm an Apple fanboy or anything; I build and repair PCs for a living, and it would be a nightmare if someone brought this thing to me, but that's to be expected. But I keep seeing comments such as "there is no usability improvement from increasing resolution", "why have a discrete GPU for anything but gaming", or "1GB of VRAM is not enough!"
I also do a lot of enthusiast photography, a field where Apple is a VERY popular name, and there the increased resolution matters more than almost anything else. The same goes for video editing and graphic artists. These are major fields for Apple computers, and this is a huge improvement. After using the new iPad, there is no way I can go back to viewing and editing photos on a subpar, lower-resolution screen; this laptop is going to be a godsend.
Also, the discrete GPU has a huge impact on a variety of applications other than gaming. Have you ever tried to switch between integrated and discrete graphics on a laptop with hybrid graphics? Or, even more specifically, on the 2008 MBP with the 9600GT? It slows down Photoshop, Aperture, all video editing software, and even Picasa! Also, with Macs it's not like a PC, where you need the firmware modifications of a professional-grade video card that costs 10x as much. A consumer-grade video card will give a Mac every bit of the 3D rendering capability of a Quadro in all CAD programs, minus the extra video RAM. And as some other posters pointed out, the 1GB of RAM on this card is NOT the bottleneck in any gaming or consumer use; it's the limited speed of the 650M, and of course there is NO WAY to cool a higher-end card in this form factor without drastically increasing fan noise, which Apple will not accept.
By the way, on my desktop Diablo 3 runs at 20-30 fps most of the time and only goes to 60 in certain scenarios, and I'm using a 4.8GHz Sandy Bridge i7 and a GTX 680. It's completely playable; it's part of the game's engine. It only gets unplayable and glitchy when it's jumping back and forth between 10 and 30 on a lower-end computer (and this laptop would be considered a low-end gaming machine in that respect). I'm sure playing at full resolution would be plenty playable, just not quite as smooth as 1440x900.
curiousjosh - Monday, June 18, 2012 - link
As a photographer, this laptop may make me leave Apple.
I need more screen real estate than 1440x900; there are just too many palettes/dialogs/programs I want to work with.
And even Apple says, right in the MBP's screen resolution dialog, that using 1920x1200 on this laptop isn't a good idea.
Looking at the images here, I can see exactly what I was worried about. At 1920x1200, web pages look like crap because all the graphics are being resampled to display on a 2880x1800 screen.
Good one, Apple.
Take away a true 1920x1200 resolution laptop.
And getting rid of the 17"? Really? Yeah... good one.
SignalPST - Thursday, June 14, 2012 - link
Looking forward to seeing the color gamut results!!!
IPID - Thursday, June 14, 2012 - link
I've had C2D and Sandy Bridge MBPs. Under heavy CPU load for 5-10 minutes they noticeably throttle the CPU, which defeats the purpose of getting a fast CPU. They hit the Tjunction temperature very quickly. Has thermal dissipation improved in this year's MBPs?
likethesky - Thursday, June 14, 2012 - link
As raised by other readers in these comments (@The Von Matrices, @B3an, @EnzonFX, et al.), would you (Anand, or anyone else who might know) recommend waiting a bit before purchasing an Apple Cinema Display, since Apple is likely to provide a much higher resolution ("retina" or otherwise) on their 27" Thunderbolt display? Or will it be at least the better part of a year or more before they could possibly do anything approaching this resolution in a 27" format?
ppayne - Thursday, June 14, 2012 - link
I played with one today; it's gorgeous, of course. Thoughts:
a) I already have a sweet early-2012 MacBook Pro with a 240 GB SSD and a 1 TB hard drive where the optical drive went. This is sweet. I spend 2 months a year on the road (in the U.S., as I am usually in Japan), and having that extra space for movies, iTunes, and incoming Dropbox files is good; switching to a 768 GB SSD only would be hard on me.
b) I am nervous about slow-to-update apps. It would not be unlike Adobe to say, "Upgrade to CS6.1 to get new features like support for Apple's new resolution," and I just upgraded. That said, I'd likely run it in smallest-text mode, since it's not that hard to read, and I would not notice any resolution issues that way.
c) I am really looking at a holistic upgrade. I use my laptop heavily, doing hardcore business stuff with it, but it's attached to an Apple 27 inch display much of the time. Give me a new Thunderbolt display with higher res, USB 3.0 and all that, and we can talk.
d) Apple is likely, hopefully, at the beginning of a major rollout of new higher-resolution stuff; I want to see where that goes a bit.
I won't be upgrading, but as an Apple shareholder I recommend that all of you do. ^_-
tmpatrick - Thursday, June 14, 2012 - link
I always thought it was me (or my card), not the slot in the Early 2011 MBP, that was the problem. I wonder what the workaround is supposed to be - repeated insertion, or applying pressure?
One question: I noticed the Kensington slot is gone from the sides - has it moved to the back, or is it just gone?
Heathmoor - Saturday, June 16, 2012 - link
It's definitely gone. I would try using the card reader slot instead; perhaps it works.
joevt - Saturday, June 16, 2012 - link
I did some experiments on my Mac Pro with an ATI Radeon HD 5870 1 GB and an Apple 30" Cinema Display (2560x1600 native) running Mac OS X 10.7.4.
I was able to use SwitchResX to create and use scaled resolutions up to 4096 wide or 4096 tall, but the screen buffer was limited to under 32 MB, which means the highest 16:10 resolution I could create was 3584 x 2240. I guess this is a limitation in the Radeon driver, since the MacBook Pro with Retina display can go higher. I wonder if it can go higher than 3840 x 2400?
I was able to use the Mac with a desktop of 3584 x 2240, which is scaled down to the native resolution of 2560 x 1600. I was also able to play games at 3584 x 2240 (a cheap - or expensive - anti-aliasing method, but it makes the game cursor even tinier). Screenshots were also 3584 x 2240.
SwitchResX creates the resolutions by editing a file in /System/Library/Displays/Overrides corresponding to the display. It also adds contextual and menu bar menus for changing screen resolution.
I used Quartz Debug.app from Xcode 4 to enable HiDPI display modes. These HiDPI modes are one quarter (one half vertical and horizontal) of any resolution that is greater than 1600 wide. I believe these modes are what the MacBook Pro with Retina display uses. I wonder if these SwitchResX menus can be used on the MacBook Pro with Retina display to select modes that are not HiDPI?
I tried the various HiDPI modes including 1792 x 1120. The text does seem smoother than if I used a non-HiDPI version of the 1792 x 1120 resolution. Of course the best HiDPI mode for the 30" Cinema Display would be 1280 x 800 but it doesn't leave a lot of screen real-estate. 2560 does not divide by 1.5 evenly so you can't get a mode that would look exactly like the 1920 x 1200 HiDPI mode looks on the 2880 x 1800 Retina display without some extra work.
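For anyone curious where that roughly-32 MB ceiling lands, a back-of-the-envelope check (assuming the usual 4 bytes per pixel for a 32-bit framebuffer and binary megabytes; the exact driver accounting is my assumption, not something joevt stated):

# Rough framebuffer sizes for the SwitchResX experiments above (4 bytes per pixel assumed).
def fb_mb(w, h, bytes_per_px=4):
    return w * h * bytes_per_px / (1024 * 1024)

print(round(fb_mb(3584, 2240), 1))  # ~30.6 MB - fits under a ~32 MB ceiling
print(round(fb_mb(3840, 2400), 1))  # ~35.2 MB - over it, consistent with 3584x2240 being the 16:10 maximum there
# The HiDPI modes are half the size in each dimension, e.g. 3584x2240 maps to a 1792x1120 HiDPI desktop.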
Heathmoor - Saturday, June 16, 2012 - link
Will the wires of this "new" MagSafe 2 also be prone to getting fried, like the first MagSafe (MPM-1) that Apple included in their laptops from 2006 to 2009?
http://en.wikipedia.org/wiki/Magsafe#Criticisms_an...
BTW, the wires of my power adapter also got fried, but in that case near the adapter end rather than the MagSafe connector.
mashkax7 - Tuesday, July 10, 2012 - link
It seems that Apple has taken away some ports, including the audio line-in that the previous MBP had. Does the headphone port also support audio line-in?
markofinsanity - Monday, July 23, 2012 - link
Yes, I believe it does.
I was on a Skype conference call with my iPhone headset plugged into the single jack, and a colleague seated nearby complained that I was making scratchy noises when I moved the mic around and it touched my face; he could see and hear it happening.
MacWorld says the same thing at http://www.macworld.com/article/1167249/answers_to... -- minus the anecdote...
markofinsanity - Monday, July 23, 2012 - link
Bottom line: the MBPr display is overkill for business app use, but looks better than other, physically larger external monitors.
Monitor 1: MBPr set to "Best for Retina"
Monitor 2: LG Flatron M2380D 1920x1058 connected via Mini DisplayPort to VGA (yeah, MDP to HDMI SUCKED, had bad underscan at default setting, and terrible pixelation/aliasing (? I don't know the right term, several commenters here will blast me for it, whatever) making the fonts look bad at any underscan adjustment level)
Monitor 3: NEC MultiSync 90GX2 1280x1024 connected via Mini DisplayPort to DVI
MBPr display is really only usable at the "best" setting; scaled to "larger text" reduces real estate too much; scaled to "more space" makes stuff too tiny even for my good eyesight. And BTW, Safari (for example), which is supposed to be optimized for MBPr, behaves exactly the same as every other app I put on that monitor, whether it's a Mac OS app or a Windows app like Outlook running via Parallels. Definitely looks crisp, though.
LG M2380D is worst of the three, although it obviously has a large physical presence. Text and images simply not crisp, but at least using VGA it's usable. I just don't get why HDMI didn't work, whether from HDMI to HDMI, or MDP to HDMI (I tried both).
NEC 90GX2 is pretty good. Glossy like the MBPr, which I like, and pretty crisp.
Note: Weird that the LG's resolution is 1920x1058; I think the monitor's is actually 1920x1080. I was going to say that maybe the missing 22 pixels were taken up by the Mac menubar (as I got the resolution figure from the Windows Display control panel in Parallels), BUT, under that hypothesis, the NEC should also have had a deduction (since it too has a menubar thanks to Multimon). OTOH, maybe the LG registers the loss because that is the primary display in the Mac OS Display System Preferences, while the NEC is not, and maybe Multimon's second menubar on the NEC is seen differently by Windows. I'm way past my ability to provide any more than conjecture, however. :-]