Closing Thoughts

There are two things you can count on with the fall gaming season: lots of games, and the occasional botched launch as publishers rush to release new titles in time for the peak of the holiday shopping spree. Ubisoft has three major games launching right now: Assassin's Creed: Unity came out last week, Far Cry 4 just released on Tuesday, and The Crew launches next week. Obviously they don't want to launch all three on the same day, but more than one person has concluded that ACU should have been delayed by a few weeks to get the bugs worked out.

So far there has been a Day 0 patch followed by the current 1.2 patch, and I believe at least two more patches are planned. The next should provide further bug fixes (and perhaps performance optimizations), while a later patch will also add tessellation support to the game. It's probably a good idea to get performance "fixed" as much as possible before adding tessellation, as tessellation could simply reduce already low frame rates on a lot of systems.

My own experience with Assassin's Creed: Unity has thankfully been mostly uneventful. There was talk of missing textures and "faceless" people, but that apparently only occurs on unpatched versions – the Day 0 patch addressed that bug, and I never encountered it. Stability hasn't been perfect, but the second patch did a lot to address crashes in my case – I've played for a few hours at a time without incident, though it seems crashes are still possible after extended sessions.

By far the biggest concern, however, is performance. I'd say that if you can average about 40 FPS (with minimums in the mid-20s or above), Assassin's Creed: Unity is playable. The problem is that to get such frame rates, you basically need to drop to Low settings on quite a few "midrange" GPUs, and even beefy GPUs like the GTX 980 aren't going to be happy with all settings maxed out at resolutions beyond 1080p. If you have the hardware, ACU is a great-looking game and a good addition to the Assassin's Creed series. But for those running older GPUs – or AMD GPUs – you probably want to wait at least another month to see what happens before buying the game.
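
To make the average/minimum distinction concrete, here's a minimal sketch of how those two numbers fall out of a frame-time log from a capture tool such as FRAPS. The frame times below are illustrative stand-ins, not measurements from our benchmark runs:

```cpp
// Average FPS comes from total frame time; minimum FPS comes from the
// single slowest frame. Values here are hypothetical, for illustration.
#include <algorithm>
#include <cstdio>
#include <vector>

int main()
{
    // Per-frame render times in milliseconds (hypothetical capture).
    std::vector<double> frameMs = { 22, 25, 24, 41, 23, 26, 38, 24, 25, 27 };

    double total = 0;
    for (double ms : frameMs) total += ms;

    double avgFps = 1000.0 * frameMs.size() / total;               // ~36 FPS
    double worst  = *std::max_element(frameMs.begin(), frameMs.end());
    double minFps = 1000.0 / worst;                                // ~24 FPS

    // An average near 40 FPS can still hide mid-20s minimums, which is
    // why both numbers matter when judging playability.
    std::printf("avg %.1f FPS, min %.1f FPS\n", avgFps, minFps);
    return 0;
}
```

A run that averages close to 40 FPS can still dip into the mid-20s on its worst frames, which is exactly the playability threshold discussed above.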

And if this is the shape of things to come, a lot of people might want a GPU upgrade this holiday season.

Comments

  • poohbear - Friday, November 21, 2014 - link

    Let's be honest, this is a poorly optimized game with an enormous number of bugs, so ridiculously messed up that it made the BBC news and Ubisoft's shares dropped 11%! It's a complete debacle.
  • dwade123 - Friday, November 21, 2014 - link

    Good thing I didn't buy a GTX 980 for $460. It can't run next-gen ports maxed comfortably. Bring out the real next-gen GPUs!
  • maroon1 - Friday, November 21, 2014 - link

    A Core i3 4130 with a GTX 750 Ti runs this game as well as the console version.

    Eurogamer did a test matching the PC's graphics quality to the console version (running at 900p with settings similar to the PS4), and the result was that the GTX 750 Ti plays it as well, if not slightly better.
  • cmdrmonkey - Friday, November 21, 2014 - link

    When a game is barely playable on the most high-end video cards on the market at resolutions and settings PC gamers are accustomed to, you have utterly failed. Bravo Ubisoft. Bravo.
  • P39Airacobra - Friday, November 21, 2014 - link

    You can forget about Ubicrap fixing this! This is why Ubicrap gave the unreal PC requirements! They are getting money from the GPU/CPU hardware makers to help market for them! And they don't care to spend more money on us scum customers anyway! So I say XXXXXXXXXXXX UBICRAP!!!!!
  • P39Airacobra - Friday, November 21, 2014 - link

    They should be arrested for doing this!
  • mr. president - Sunday, November 23, 2014 - link

    Any chance of testing CPU performance on AMD vs. NVIDIA GPUs? I've seen a *ton* of recent games underperform on AMD GPUs due to what I think is their lack of support for deferred contexts, aka 'multithreaded rendering' (see the sketch after the comments below). It's particularly low-end CPUs that are affected.

    Unity pushes something like 50,000 draw calls each frame. Note the enormous disparity in minimum frame rates between the two vendors at 1080p/Medium, where even slower NVIDIA GPUs get higher minimums than faster AMD GPUs. I think it's worth exploring, as even low-end FX CPUs can almost double their performance on high-end NVIDIA GPUs vs. high-end AMD GPUs.
  • FlushedBubblyJock - Tuesday, November 25, 2014 - link

    That last line tells me AMD is offloading boatloads of work to the CPU. Isn't that exactly why Mantle exists for low-end CPUs? It relieves AMD's gigantic, overburdened normal driver that hammers their puny CPUs.

    It's sad, really: shortcuts and angles and scammy drivers that only hurt everyone.
  • RafaelHerschel - Sunday, November 23, 2014 - link

    A few observations:

    60 frames per second isn't some arbitrary value. With Vsync enabled at a 60Hz refresh rate, any frame that misses the 16.7ms refresh deadline is held for a second refresh – an instant drop to 30 fps for that frame – so dips below 60 fps are far more unpleasant. Adaptive Vsync addresses that but isn't available to everybody. Disabling Vsync leads to screen tearing, which some people (me included) find extremely annoying.

    In a game every frame consists of discrete information. In a movie each frame is at least partially blurred, a natural effect of capturing moving objects. For a game to feel fluid at 24 or 30 fps, it needs to add artificial motion blur.

    In movies every frame has the same duration. In games the duration of each frame varies, so even 60 fps can feel choppy.

    Different people have different sensitivities. I always notice a low frame rate and frame drops. A steady 60 fps with Vsync enabled works best for me; anything below 50 fps (in a game) feels off, and above 60 I don't notice much difference. Likewise, for gaming and movies I use screens with a fast response time, since ghosting really distracts me.

    I feel that with a decent system a 60 fps minimum should be attainable. What bugs me is that in some games lowering the quality settings has little impact on the minimum frame rate.

    I'm always surprised by blanket statements like "30 fps is perfectly playable." Depending on the game, the settings, and the person playing, it often isn't. For me another factor is how close I am to the screen.
  • JarredWalton - Monday, November 24, 2014 - link

    FWIW, I've been playing games for about 35 years now (since I was 6 on a Magnavox Odyssey 2), and when I say a game is "playable" at 40 FPS, what I'm saying is that someone with years of game playing behind them feels the game works fine at that frame rate. I've also played ACU for many hours at sub-60 FPS rates (without G-SYNC enabled) and didn't mind the experience. Of course, I wasn't the one saying it was "perfectly playable" above, but it is most definitely playable and IMO acceptable in terms of performance. If you want *ideal*, which is completely different, then yes: 60+ FPS is what you want. But then there are those with 120Hz LCDs who would want even higher frame rates. YMMV.

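As a footnote to the deferred context discussion in the comments above, here is a minimal C++ sketch of the D3D11 "multithreaded rendering" mechanism in question. The RecordDrawCalls helper, the thread count, and the draw call split are hypothetical illustrations rather than anything from Ubisoft's engine; CreateDeferredContext, FinishCommandList, and ExecuteCommandList are the actual D3D11 entry points:

```cpp
// Minimal sketch of D3D11 multithreaded rendering via deferred contexts.
// Helper, thread count, and draw split are illustrative assumptions.
#include <d3d11.h>
#include <thread>
#include <vector>

// Hypothetical stand-in for an engine's per-object state setup and draws.
void RecordDrawCalls(ID3D11DeviceContext* ctx, int first, int count)
{
    for (int i = 0; i < count; ++i) {
        // ...bind the shaders/buffers for object (first + i) here...
        ctx->Draw(3, 0); // placeholder draw call
    }
}

void SubmitFrame(ID3D11Device* device, ID3D11DeviceContext* immediate)
{
    const int kThreads = 4;
    const int kDrawsPerThread = 12500; // ~50,000 draws split four ways

    ID3D11DeviceContext* deferred[kThreads] = {};
    ID3D11CommandList*   lists[kThreads]    = {};

    // One deferred context per worker thread; recording runs in parallel.
    for (int i = 0; i < kThreads; ++i)
        device->CreateDeferredContext(0, &deferred[i]);

    std::vector<std::thread> workers;
    for (int i = 0; i < kThreads; ++i)
        workers.emplace_back([&, i] {
            RecordDrawCalls(deferred[i], i * kDrawsPerThread, kDrawsPerThread);
            deferred[i]->FinishCommandList(FALSE, &lists[i]);
        });
    for (auto& t : workers) t.join();

    // Only the immediate context talks to the GPU; playback is serial but
    // far cheaper than recording, so slower CPUs benefit the most.
    for (int i = 0; i < kThreads; ++i) {
        immediate->ExecuteCommandList(lists[i], FALSE);
        lists[i]->Release();
        deferred[i]->Release();
    }
}
```

When a driver doesn't support command lists natively, the D3D11 runtime emulates them and most of the recording cost lands back on a single thread, which would be consistent with the vendor gap in minimum frame rates described in the comments.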