
Why the plight of rural gamers should concern web developers too


This week sees the eagerly awaited release of the game Fallout 4, but despite counting myself among the series' legion of fans, I'm not going to be able to play it. It's not because of a lack of time or money; it's due to a recent environmental change which I hadn't accounted for.

I moved house.

My new neighbourhood

I'm living in a deeply rural area, whose beauty is offset only by some of its inconveniences. There's zero mobile signal, so I often find myself battling nomophobia (occasionally walking to the next village, just to see if anybody's tried to call!), and even the wired services are lagging a decade behind the times. This is the result of my internet speed test:

[Image: speed test results - "Slower than 94% of Great Britain" - ouch]

Even 10 years ago, this wouldn't have been a big deal. Games were almost entirely bought over the counter, patches were a rarity and games would generally function fine without them. But today, the PC games market is largely digital, and even with consoles, there's an assumption that you're always connected to the internet and won't mind the occasional impromptu multi-gigabyte update.

Which brings me back to Fallout 4. The digital version of the game is a 23.8GB download, and a quick calculation suggests that it'll take 72 hours to transfer using my connection. That's before we get to the "day one patch", the now-commonplace technique which allows developers to continue working on bugfixes and improvements after they've locked down the game for manufacturing. For most triple-A titles, the day one patch is in excess of 10GB. So, even if I bought the physical version of the game, that's about 30 hours of downloading an initial patch before I'd be allowed to play the game that I'd just bought.
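
For the curious, here's the back-of-the-envelope arithmetic in Python. The ~0.75 Mbps line speed is my own estimate, inferred from the 72-hour figure rather than quoted directly from the speed test:

```python
# Rough transfer-time arithmetic; the 0.75 Mbps figure is an estimate of my
# rural line speed, not an exact measurement.
def transfer_hours(size_gb, speed_mbps):
    """Hours needed to download size_gb gigabytes at speed_mbps megabits/sec."""
    bits = size_gb * 1000 ** 3 * 8           # decimal gigabytes -> bits
    return bits / (speed_mbps * 1000 ** 2) / 3600

print(f"Fallout 4 download (23.8 GB): {transfer_hours(23.8, 0.75):.0f} hours")
print(f"Typical day one patch (10 GB): {transfer_hours(10, 0.75):.0f} hours")
```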

And that's if the disc-based version of the game actually contains the game code at all. Just recently, the PC release of Metal Gear Solid V caused consternation when its disc was found to contain only 8MB of data - a copy of the Steam installer!

So, right now my hopes of playing Fallout 4 are on hold, at least until I can fathom how to drag my PC or console to a publicly available, uncapped, high-speed internet connection...

The two-tier system of mobile internet access

My home internet situation is admittedly rare these days, and is largely a problem of my own making. However, there's a much larger market of people who are struggling with connection speeds: people who are trying to access the web on their mobile devices.

In the past year, mobile app and mobile web usage has overtaken desktop for the first time. If you're one of these mobile converts, you'll probably recognise two distinct tiers of mobile internet usage: the stuff you do when you're connected over wi-fi and 4G (video browsing, uploading/downloading/syncing) and the stuff you do when you're anywhere else (often waiting 20-30 seconds for a simple webpage to load, and encountering timeout errors).

This latter usage is surprisingly common. In the UK, 4G coverage is still exceedingly rare outside cities, and public wi-fi is often throttled to prevent misuse. Even when I visited San Francisco earlier this year, I was astonished to find there was almost no publicly available wi-fi, as their over-the-air networks are so good for locals (roaming tourists be damned!) that there's not deemed to be much need for it.

As web developers, we need to be aware that a large and diverse cross-section of our audience will be accessing our content under less-than-ideal conditions. It's sometimes hard to picture when we're sitting in plush chairs in our air-conditioned office with a fibre connection, but if we're trying to understand our users' experience, chances are that it's very different to how we're consuming it ourselves.

Those of us who've been in the business since the first dotcom bubble (and boy does that make me feel old) will recall that this used to be one of the most precious tenets of web development. Back when even your most privileged users were accessing your content via a 56kbps dial-up connection, things like page size mattered. Yet many people's mobile internet connections are still of a comparable speed - and let's not forget a significant portion of the developing world is still actually using 56K modems to get online. (If you've forgotten what that's like, read Andrew Spaulding's recent article "I used a 56K modem for a week and it was Hell on Earth".)

Back in the late '90s, every byte mattered. JPEG compression and other optimisation techniques were applied to ensure that image filesizes were as small as humanly possible whilst still retaining comprehensibility. Waste was a sin. We even had low-bandwidth, "text-only" versions of our websites. Why did this ever change?
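
The same arithmetic from earlier makes the point for page weight. A couple of megabytes is unremarkable for a page today, but here's roughly what that costs on the kinds of connections discussed above (the 2MB page and the connection speeds are illustrative assumptions, not measurements):

```python
# Illustrative load times for a 2 MB page; connection speeds are ballpark figures.
def load_seconds(page_mb, speed_kbps):
    return page_mb * 1000 ** 2 * 8 / (speed_kbps * 1000)

for name, kbps in [("56k dial-up", 56), ("2G mobile", 200), ("throttled public wi-fi", 1000)]:
    print(f"{name:>22}: {load_seconds(2, kbps):4.0f} seconds")
```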

Complacency plays a large part. I see this from designers all the time. They'll create site mockups and assets on their fancy widescreen retina displays, producing an eye-bleedingly beautiful design which unfortunately fails to scale elegantly below about 1600px width. Similarly with the aforementioned image sizes: I recently tested a site which included a 4000px-wide, multi-megabyte image of a post-it note, using client-side resizing to render it with more appropriate dimensions. (Their justification? "But if they've got a 4000px-wide screen, it'll look really nice." - it's a post-it, for heaven's sake!)
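
The fix for that particular sin isn't complicated: generate appropriately sized variants at build time (for use with srcset or similar) rather than shipping the original and resizing it in the browser. A minimal sketch using Python's Pillow library - the filename and widths are illustrative, not taken from the site I tested:

```python
# Generate downsized variants of a large source image for responsive delivery.
from PIL import Image

SOURCE = "postit-original.png"               # hypothetical 4000px-wide asset
for width in (320, 800, 1600):
    img = Image.open(SOURCE)
    img.thumbnail((width, width))            # preserves aspect ratio, only shrinks
    img.save(f"postit-{width}w.png", optimize=True)
```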

Ironically, frameworks which have been created with responsive design in mind can often work against us. Drupal, for example, will preload assets in the page header before it decides whether it needs to display them. You might well have configured your site to have a fancy 3MB header image for desktop users, and thought to suppress that image for mobile/tablet users. However, if you're not careful, you'll find that the site still loads that 3MB of data on mobile, even though it ultimately doesn't render it.
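
A quick, if crude, way for a tester to spot this kind of waste is to total up the image payload a page declares and compare it against what actually needs to appear on a small screen. The sketch below (the URL is a placeholder) only counts <img> tags, so it's a first-pass check; a capture from a real mobile browser remains the definitive way to confirm whether that hidden 3MB header image was downloaded:

```python
# First-pass check of a page's declared image payload. Only <img> src attributes
# are counted; CSS background images need a real browser capture to measure properly.
from html.parser import HTMLParser
from urllib.parse import urljoin
import requests

class ImgCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.srcs = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            src = dict(attrs).get("src")
            if src:
                self.srcs.append(src)

page_url = "https://example.com/"            # placeholder for the page under test
collector = ImgCollector()
collector.feed(requests.get(page_url, timeout=30).text)

total_bytes = 0
for src in collector.srcs:
    url = urljoin(page_url, src)
    head = requests.head(url, allow_redirects=True, timeout=30)
    size = int(head.headers.get("Content-Length", 0))
    total_bytes += size
    print(f"{size / 1024:8.0f} KB  {url}")

print(f"Declared <img> payload: {total_bytes / (1024 * 1024):.1f} MB")
```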

What can testers do about it?

This is going to be the topic for this month's Weekend Testing Europe session. As testers, what can we do to recognise and highlight these problems before they get into the wild? We'll be looking at a variety of aspects:

  • Discussion about being a proxy for your users: they're not going to notice there's a problem until it's too late.
  • What tools can we use to simulate the experience of mobile users, and users on slow connections? (One possible approach is sketched after this list.)
  • ...or do we need to simulate it at all? Surely we all know of a local internet blackspot where the connection is appalling - when I worked in London, "testing on the train" was a real and useful activity!
  • As a development team, what targets/expectations can we set for load times and page sizes (and do we understand our users enough to set these)?
  • Does our architecture support low-bandwidth users - are we doing responsive design well, and are we using caching etc as effectively as we can?
  • How can we get product teams to care about this - can they recognise the problem when they see it?
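
On the tooling question, here's one option I'll be bringing along: ChromeDriver exposes a network-conditions hook through Selenium's Python bindings, so an automated check can be run against something resembling my rural connection. This is a sketch rather than a recipe - it's Chrome-specific, the URL is a placeholder, and the throughput figures merely approximate my ~0.7 Mbps line:

```python
# Simulate a slow connection in an automated check (Chrome-specific feature).
from selenium import webdriver

driver = webdriver.Chrome()
driver.set_network_conditions(
    offline=False,
    latency=300,                             # extra round-trip latency, in ms
    download_throughput=90 * 1024,           # bytes per second (~0.7 Mbps)
    upload_throughput=45 * 1024,
)
driver.get("https://example.com/")           # placeholder for the page under test
load_ms = driver.execute_script(
    "return performance.timing.loadEventEnd - performance.timing.navigationStart;"
)
print(f"Page load took {load_ms} ms under throttling")
driver.quit()
```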

Please join us next Sunday (3.30pm GMT) to discuss this further, or leave a comment below if you have any thoughts!

