I’ve mentioned this game before on this site, last year when it hit the floor at E3 and I thought, now THAT’S a game I’m going to enjoy playing. Well, it wasn’t. Frankly, at the moment I refuse to purchase the thing until its “functionality” patch is released. If you’re not familiar with the state of Watch Dogs for PC, I’ll return to it in more detail below. But for now… what am I blathering on about?
Currently, Ubisoft is under heavy fire and fierce accusations of not only releasing a non-functional game, but of also gimping an already dysfunctional game so that images like this one could be gathered. Normally conspiracy theories are just that, theories, but in this instance, as an experienced PC gamer, I am qualified to point out a few trend-defying qualities possessed by this title, which go directly against statements released by Ubisoft’s PR team and developers alike.
Before I state my case and opinion, there’s a thorn in my side that must be addressed. That thorn is being yanked in many directions right now, and the argument focuses on one point of contention: RESOLUTION. There is a fierce debate over whether these things matter, but I don’t think people really understand what these values actually MEAN to the player.
Let’s start with RESOLUTION, and for the sake of fluidity, let’s start at the bottom. As many of you know, when you buy any sort of digital screen, be it a TV or a computer monitor, it lists a resolution. Monitors come in all sorts of resolutions, but TVs usually stick to two (or three) options, and simply enough, the resolution is the number of pixels on the screen. Pixels are of course the little “dots” on your screen that change colors to create images. If you look close enough at your TV, it is pretty easy to see each individually defined pixel.
720p – 1280×720
1080p – 1920×1080
4K – 3840×2160
All of these list the width first, and then the height. So if you’ve got a 1080p TV, that means there are really 2,073,600 tiny dots on your TV that can each represent only one color at any given time. A screen may list several “supported” resolutions, but what matters is its “native” resolution, because that’s how many dots it actually possesses.
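If you want to verify the arithmetic yourself, here’s a quick sketch in plain Python (my own illustration, nothing game-specific) that multiplies width by height for each of the common formats above:

```python
# A "resolution" is just width x height, so the total number of dots
# on the panel is a simple multiplication.
resolutions = {
    "720p": (1280, 720),
    "1080p": (1920, 1080),
    "4K": (3840, 2160),
}

for name, (width, height) in resolutions.items():
    print(f"{name}: {width * height:,} pixels")
# 1080p works out to 1920 * 1080 = 2,073,600 dots, as claimed above,
# and 4K is exactly four times that.
```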
Okay, now that we’ve all reminded ourselves how TVs work, let’s think about how resolution is displayed on two separate TVs: first a 720p TV, and then a 1080p screen.
First, the 720p. You’ve just purchased your new 720p TV on sale at the store, brought it home, and hooked it up. You plug in your console, which auto-detects the display and outputs image information scaled for a 720p screen. Things are great and kosher, everything matches up: a green pixel from the source pops up as a green pixel on the screen.
Now you’ve upgraded: you went to Best Buy and there was a nice big 1080p TV on sale and you just had to buy it. You get it home, you hook up your Blu-ray player, toss in a disc, and wow, check out that detail. Once again, the 2,073,600 dots the Blu-ray player is transferring line up perfectly with the 2,073,600 dots on your TV. “This is great,” you say as you witness your pristinely displayed media. “Can’t wait to see how great my games look on this beautiful new TV.” You rush to connect your console and toss in a game of your choosing. You start to get really critical and finally state out loud, “You know… I think my old TV actually looked better.” You’re not crazy, and you’re not being hyper-critical out of buyer’s remorse. You’re observing what’s called image scaling (or, if you want to impress your friends, it’s properly called interpolation). Depending on your TV and the game console, there are two types of interpolation: adaptive and non-adaptive.
Non-adaptive interpolation simply tells the source to find the closest mathematical factor of the screen’s resolution, in this case telling the source to output 960×540 (540p). If you had a tube TV over 42 inches, this resolution will look familiar to you. Now, if your TV has a built-in upscaler, it will explode the image to fill the whole screen (in a checkerboard pattern) and then use math to guess what color the pixels in the middle should become (if a pixel sits between two black pixels, it’s made black; a black pixel next to a gray pixel fills in with gray, for example). If, like most TVs, it does not, then it just makes every one pixel from the source display as four pixels on your screen, bundling them, if you will, to fill the panel from a 540p signal.
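To make the “bundling” case concrete, here’s a toy sketch (my own illustration, not any real TV firmware’s algorithm) where each source pixel is simply repeated into a 2×2 block of screen pixels, adding size but zero new detail:

```python
def pixel_double(image):
    """Naive non-adaptive upscale: every source pixel becomes a 2x2
    block on the target screen (the 540p-on-a-1080p-panel case)."""
    doubled = []
    for row in image:
        # repeat each pixel horizontally...
        wide_row = [pixel for pixel in row for _ in range(2)]
        # ...then repeat the whole row vertically
        doubled.append(wide_row)
        doubled.append(wide_row)
    return doubled

# A 2x2 source image becomes 4x4: four times the dots, same information.
source = [["black", "gray"],
          ["gray", "white"]]
for row in pixel_double(source):
    print(row)
```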
In either of those cases, yes, the image quality is noticeably affected, because less original information is being sent from the source, creating less detail.
More common is adaptive scaling, which does mostly the same thing, but uses more complex algorithms to handle non-mathematically-perfect resolutions, such as 720p or 900p on this new 1080p TV. Instead of exploding the image like a checkerboard, it draws an imaginary grid at its native 1080p resolution, takes the average color of the source information in each grid space, and tells the pixel that grid space represents to display that averaged color. This results in blurriness and loss of color fidelity and consistency, as a color average can easily be skewed by things like black outlines or white spots.
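Here’s a toy version of that averaging idea, again just a sketch of the principle rather than any real scaler’s implementation (names and grid math are my own). Each target pixel takes the average of the source pixels falling in its grid cell, and you can see how a single black pixel drags a whole cell toward gray:

```python
def area_average_scale(image, out_w, out_h):
    """Toy 'adaptive' scaler over a grayscale grid (0=black, 255=white):
    lay an out_w x out_h grid over the source and give each target pixel
    the average of the source pixels that fall inside its cell.
    The averaging step is exactly what blurs sharp edges."""
    in_h, in_w = len(image), len(image[0])
    result = []
    for ty in range(out_h):
        # source rows covered by this target cell (at least one)
        y0 = ty * in_h // out_h
        y1 = max((ty + 1) * in_h // out_h, y0 + 1)
        row = []
        for tx in range(out_w):
            x0 = tx * in_w // out_w
            x1 = max((tx + 1) * in_w // out_w, x0 + 1)
            cell = [image[y][x] for y in range(y0, y1) for x in range(x0, x1)]
            row.append(sum(cell) / len(cell))
        result.append(row)
    return result

# One black corner in an otherwise-white image: the average is no longer
# black OR white, it's a made-up gray the artist never drew.
source = [[0, 255],
          [255, 255]]
print(area_average_scale(source, 1, 1))  # [[191.25]]
```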
In either of these cases, yes, as a matter of fact, your old TV did do a better job of accurately displaying its source material.
HOLD UP. Did I just say that a 720p output could look better than a 900p output? Well, yes and no. From a perception-of-detail standpoint, yes: the perfectly matched output will display the image in true color, as the game developers designed it. Lines will be sharper, colors will be richer, detail will be more pronounced. The human eye does a much better job of blending contrasting colors than the algorithm can. That said, however, the 900p image on the 1080p screen will look “smoother” because of the blurring effect the algorithm creates.
The real point here is that a screen will always, always, always look best at its native resolution. A device can only display true color at its native resolution or at a perfect factor (1/2, 1/3, 1/4) of that resolution.
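That “perfect factor” rule is easy to check mechanically. Here’s a small helper (hypothetical name, my own sketch) that tests whether a source resolution divides evenly into a panel’s native grid:

```python
def scales_cleanly(native, source):
    """A source resolution maps onto a panel without guesswork only when
    both native dimensions are the same exact integer multiple of the
    source's dimensions."""
    (nw, nh), (sw, sh) = native, source
    return nw % sw == 0 and nh % sh == 0 and nw // sw == nh // sh

print(scales_cleanly((1920, 1080), (960, 540)))   # True: perfect 1/2 factor
print(scales_cleanly((1920, 1080), (1280, 720)))  # False: 720p on a 1080p panel needs blurry interpolation
print(scales_cleanly((3840, 2160), (1920, 1080))) # True: 1080p quadruples neatly onto 4K
```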
Because of the jump to 720p output in the Xbox 360 era, in which game output resolution rose to perfectly match screen resolution, we saw the clarity of “HD gaming.” Right now, though, that’s not the case: display technology has progressed beyond what these devices deliver, and therefore, because of all the work their upscaling TVs were already doing, many people are not seeing the big differences in image reproduction between their PS3s and PS4s that they were expecting. That said, resolution isn’t the only thing that matters in a game; there are lighting effects, filters, etc. that have also been given improved breadth with this new generation.
So now you’ve soldiered through all the required reading for my point (or skipped it because you either already understood or didn’t care). Let’s take a look at this Watch Dogs screenshot, shall we? Awfully similar, no? I mean really, really similar. If you were to look at the two and give an answer as to which you thought looked “better,” which would you say?
Got your decision? What if I told you I manipulated this image and swapped the labels? Does your decision stand?
Compared to the E3 gameplay video they presented in 2012, this is a little bland, isn’t it? Does it strike you as odd that a PC with roughly 5–10 times the graphical chops displays such a similar image? It’s almost like the game was designed with a visual target in mind, right around the edge where the PS4 peters out. A machine can only display the image it’s given, nothing more, nothing less.
That’s a pretty lofty accusation on my part, though: that a developer would deliberately shoot low, intentionally reduce the graphical prowess of a work in progress, and settle for what they expected the PS4 could handle. It’s not like they had demos or dev kits that possessed E3-demo-quality graphics and effects, or had already made all the shaders and effects they dropped for release. And of course, we must all remember that according to the CEO of Ubisoft as well as the dev team themselves, PC was the lead platform on this thing. It was created specifically to drive a PC to its potential and then be ported to consoles with tweaks for optimization. I’m sure they wouldn’t simply lie about this to appease hungry PC gamers, and there’s nothing to tell us otherwise, aside from the game performing like garbage because the engine was specifically designed for the shared VRAM configuration of the XBone and the PS4.
By the way, that last paragraph was all sarcasm; every single one of those things happened and exists. In fact, a mod was released a few days ago, saturating the internet with headlines, because with a mere 3.5 KB of tweaks to the game’s config files, hidden inside the title were those graphics we saw at E3. Particle effects, shaders, artifacts, clearly hundreds of hours of work, locked away inside the game.
Of course, the question on everyone’s minds is “why?” Now, I have to disclaim myself right here and say that this surely isn’t the devs’ fault; all of the issues that we are seeing are clearly matters of management and allocation, and not merely “poor” management. These employees are clearly not happy about the situation either (indicated by their silence; usually you get at least one or two employees who will anonymously shed some light on the situation). It’s beyond question that the devs’ NDA leashes are being pulled tight right now, and Ubisoft is trying to leave everything to its PR department, as it’s their mess to clean up, as usual.
As an amateur programmer and full-time sysadmin with a little experience in the corporate world, I’ve come up with two theories.
1. The conspiracy theory: Sony made it worth Ubisoft’s while to release a game that they could market on their system without it being dwarfed by screenshots of the PC version in its E3-like glory. As much as console enthusiasts say “graphics don’t matter,” the modded game looks completely different from the vanilla screenshots. (Reminds me of that Wii version of NFS Hot Pursuit they did; it was so comical that NFS never released on the Wii again.) If the brand-new ultra-super-mega-buy-me-buy-me-crystal-clarity super-consoles came out with such a huge visual disparity this close to launch, there’s a genuine risk people might not buy either console and instead wait and see what the Steam boxes have to offer. That’s kind of a long shot, for a developer to risk its reputation to feed the boss’s pockets, though I’d be interested to hear exactly how much money changed hands between Sony and Ubisoft, because Sony somehow also got movie rights to the Watch Dogs IP.
Eh, it’s not unheard of, but I’d like to think that Sony wouldn’t make such an obvious move.
2. The more likely theory, and the one I’m betting on: the PC version was having serious problems nailing down total stability, and at some point before, during, or after the period the game was delayed, the preliminary PS4 port was the only stable version (indicated by the original PS4 dev kit). The PS4 dev kit, for those unfamiliar with corporate game development, is built to a target spec so the game will look its best on that hardware. So, unlike PC dev, where they go all out and say, “Here’s how we drew it originally, and here are some quality settings you can fiddle with to find the happy medium for you,” console dev will take a dev kit and build to the spec of that machine, working with the IP to try and get it to look as good as possible within a known performance ceiling.
Usually this console version won’t have much content above what they expect the machine can handle, because disk space is at a premium; they then fiddle with the game until they find the ideal combination of smooth performance and graphical fidelity.
What I think happened, and again this is only a hypothesis on my part, is that at some point, in the shadow of an approaching deadline, the PS4 dev kit build became the new lead version and was ported back to its PC source, with the plan to tweak whatever they could graphically in patches post-launch. That’s why the files for the original lighting effects and engine configurations were left in the game download: so that if they could manage to create a patched version, it could be released in a small update rather than a large patch requiring several GB of data to be transferred, and the game could still release on time. The press embargo on Watch Dogs was longer than usual; the blackout didn’t officially lift until 22 hours after “release hour,” at least for PC reporters like PC Magazine and TotalBiscuit. The game would release on time, and they’d cash in on release-day hype and sales.
If this actually happened, it explains how something as elementary as the lack of memory segregation could be overlooked this late in the development cycle, as the PC dev team would possibly not be as familiar with the specialized tweaks made to the (then offshoot) PS4 port. Again, if this were the case, the only additional options for the PC player to enable would be those that were disabled as tweaks in the PS4 dev kit. If this guess is true, then the Ubisoft CEO didn’t lie in saying PC was the lead platform; they were telling the truth that they didn’t “downgrade” the PC version, and the files at the time of the re-port were indeed unstable and unsuitable for release, at least on the original platform.
If importing them onto the PS4 port’s files did work, in some way, as they seem to, then it would give us a glimpse as to why the base of the PC version was scrapped and transferred to a modified base used by the PS4 port.
Assuming an emergency shift like this, objective number one once the re-port was done would be to take out the specific code that was added for the PS4, but without breaking the game engine in the process, all the while applying whatever PC features they could salvage to the modified engine. That too would explain why it has taken so long for a patch addressing the memory split to arrive, and why the dev team has been so uncharacteristically tight-lipped in the wake of what was, at best, a shaky launch.
By the way, if you’re still reading: I didn’t actually modify that image. But you looked, didn’t you? Did you believe me for a moment? Just an indication of how similar the versions really are.