📅 Posted 2020-08-11
The web is rad. Well, it has been for a long time. But maybe the web is changing. As a user of a web browser, who spends a lot of time reading websites and publishing things on the web (that maybe few people read, ha!), these are my thoughts about the web as it is today.
I’ve been mucking around with the web since about 1996, when I built my first website with liberal use of the Photoshop airbrush and a bit of Notepad. Some say I might’ve discovered Dreamweaver along the way too, but it wasn’t nearly as much fun as Netscape Communicator and super chisel-edged tables. The day I discovered that server-side includes meant I could share a menu across pages without resorting to frames was suddenly exciting! I’m in no way a thought leader in this space, but I can dabble with HTML, CSS, JS, random HTTP headers, the odd DNS record and how to build a serverless CMS using AWS and Hugo.
Things I like
I think Tim Berners-Lee got so much right with his original vision for the web. Things like universality: people can put things online using whatever technology they’ve got. Accessibility concerns are addressed. Differing browsers, internet connections, devices, short-form, long-form, comedy, serious: it doesn’t matter. Things can be linked together really easily. That’s one of the reasons why URIs are so important, so stop trying to hide them because they look “technical” or something. OK, maybe the double slash was a poor choice, but that’s hindsight for you.
Open standards are so important too. It’s like everyone can mostly agree on a language in which to communicate and integrate solutions. RSS, JSON-LD, all very important things that keep the web alive and ensure content can be syndicated, notified, shared, indexed and of course, consumed.
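As a sketch of what these standards look like in practice, here’s a JSON-LD description of a blog post, the kind of structured data crawlers parse when indexing a page. All the values below are placeholders I’ve made up, not taken from any real site:

```javascript
// Build a minimal JSON-LD description of a blog post using the
// schema.org vocabulary. All values are illustrative placeholders.
const article = {
  "@context": "https://schema.org",
  "@type": "BlogPosting",
  headline: "The web is rad",
  datePublished: "2020-08-11",
  author: { "@type": "Person", name: "Example Author" },
};

// Embedded in a page as <script type="application/ld+json">…</script>,
// this serialised form is what search engines read.
const jsonLd = JSON.stringify(article, null, 2);
console.log(jsonLd);
```

The nice part is there’s nothing proprietary in it: any publisher, big or small, can emit this and be understood.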
I remember checking out the website of a guy from Israel who, back in the ’90s, uploaded photos of his Lego models. That was a great site. It probably looked blindingly awful by today’s design standards, but that’s missing the point. He was able to publish his interests on a niche website. We need more of that.
I’ve been toying with the idea of building a search engine which indexes content from niche publishers. Like this site, or that site. Content written by boffins about the things that matter to them. It would be interesting to train a model to recognise these kinds of sites, so that you can be guaranteed it’s a decent site and not an SEO link farm. In the meantime, I’ll just continue to pick out cool reads from Hacker News.
Thinking about spinning up a new website?
You get a lot “for free” by developing a site or app that targets modern web browsers. I mention web browsers a lot, because the web browser is the primary way most people experience the web. I’m going to ignore the earlier weird web created by the likes of AOL. Who ever used all of those ‘free’ CDs, anyway? The web browser is what most people are familiar with. I say “modern” because, well, IE always stank and nobody liked it anyway.
How about native clipboard access? I mention this because of the recent ‘notifications’ that iOS 14 will show when apps sneakily read the clipboard. For most apps the culprit seems to be the inclusion of certain SDKs which scrape the clipboard, but it’s shocking to me that this was even possible. Can you imagine if your clipboard could be scraped simply by navigating from one site to another? I understand the web’s permissions model is not ideal, but it’s good that Apple is working on improving it. Let’s hope Android follows suit.
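On the web, clipboard reads go through the async Clipboard API, which the browser gates behind a secure context and a permission prompt. A sketch of a defensive wrapper (the function name and fallback behaviour are my own invention, not a standard API):

```javascript
// Sketch: a site can't silently scrape the clipboard. The async
// Clipboard API requires a secure context and the browser prompts
// the user (or rejects outright) before readText resolves.
async function readClipboardIfAllowed(nav) {
  if (!nav.clipboard || !nav.clipboard.readText) {
    return null; // API unavailable, e.g. insecure context or old browser
  }
  try {
    return await nav.clipboard.readText(); // may trigger a prompt
  } catch (err) {
    return null; // user denied, or permission not granted
  }
}
```

In a real page you’d call `readClipboardIfAllowed(navigator)` from a user-initiated event handler; passing `navigator` in as a parameter just makes the sketch easy to reason about.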
How about native printing, including export to PDF? Add some special CSS to improve the printing experience and you’re away and running. Sometimes it looks pretty bad (I’m looking at you, Confluence) but with a bit of care and attention, printable versions of websites aren’t that difficult. And when I say printable, I mean that not many people are interested in printing to paper (thankfully), but printing also means saving to PDF. Lovely. Speaking of which, I haven’t bothered checking this site, oops!
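The special CSS I mean is only a few lines. A sketch of typical print-only tweaks (the selectors here are illustrative, not from any real site):

```css
/* Sketch: print-only overrides; selectors are illustrative */
@media print {
  nav, footer, .comments { display: none; }        /* hide page chrome */
  a[href]::after { content: " (" attr(href) ")"; } /* show link URLs */
  body { font-size: 12pt; color: #000; background: #fff; }
}
```

The same rules apply whether the reader hits an actual printer or just chooses “Save as PDF”.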
What else can we do?
- Saving entire webpage contents (not applicable for SPAs) for offline browsing
- Easy to inspect code, using dev tools, although I feel like the obfuscation from React is working pretty hard to undermine this!
- Easy to debug networking issues and data being sent/received without having to use MITM-style proxy tools
- Single codebase which is cross platform / cross device without having to dip into cross-platform mobile development tools (they never did work that well back in the day, are they any good now?)
- Browser allows for plugins, add-ins, Greasemonkey scripts, bookmarklets… lots of customisation opportunities
- Easy to inspect local storage, caching, local data
- Native in-page search - just press CTRL+F!
- Select to copy text, paste elsewhere. Sites that attempt to block certain actions like the right-mouse-click are laughable, but I guess it works on some people
- Bookmarks, so you can come back to where you were previously
- Sharing pages is so easy, just copy the URL. No need to write a ‘share’ utility (conversely, AMP/Google are actively trying to diminish the value of the URL)
- Search engines index your content - who needs to write special deep linking or handling logic? You get that automatically if you’re doing URIs right.
- Easy to push out updates, people always stay on the “latest version” of your site and code
- User can control pan/zoom themselves, unless of course you’re an idiot and you ban this for the portion of the audience who don’t know how to force enable zoom
- Faster to load a page for a site not recently visited than to cold boot an app - try it! Bandcamp I’m looking at you.
- Walled garden? Can be great if the wall provides privacy protections, not so great if the garden is limited by a company not in your control and without your best interests at heart. But that wouldn’t be any modern company with a major source of revenue from data and advertising, now would it?
- Old sites “still work” (static HTML) - just look at Space Jam (unfortunately timing out as I write this!) a website that has been up and down but demonstrates a lovely 1996 web aesthetic!
- You can easily save images if you want to, despite all of the copyright challenges. I think people need to get used to the fact that once it’s online, it’s out there, in the ether, for good!
- The modern browser has a very mature web API: video, audio, storage, DRM, crypto, push notifications, location… sure, sometimes compatibility is hard, having just fixed a .sort() difference between modern Chrome and Firefox browsers, but that’s why you have Can I Use and proper cross-platform testing
- Low barrier of entry to ‘surf’ to a new site: just navigate and wait for the load/render; you don’t need to download a giant binary and install it locally to view content or try out the app experience. Funnily enough, despite faster computers and connections, sites seem to load at much the same speed as they did in the ’90s because we’ve overloaded them with too much adtech and dark patterns!
- Easy to link internally and externally to explore sites and content - goes back to the magic URI once again of course. I’ve been guilty of changing URIs radically over the years. Sometimes I’ve left a trail of 301s and other times not. But I’m learning!
- Web browsers are also kinda interesting on mobile devices, they’re almost like a sandbox in a sandbox. I feel like there is an extra layer of sandboxing between your device’s capabilities (eg various sensors) and the website you’re using.
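On that .sort() difference: before ES2019 the spec didn’t require Array.prototype.sort to be stable, so Chrome and Firefox could order equal-keyed items differently. The portable fix, sketched below with made-up data, is an explicit tie-breaker in the comparator:

```javascript
// Sketch: don't rely on engine-specific sort behaviour. Pre-ES2019,
// equal-keyed items could come out in different orders across browsers.
// An explicit tie-breaker makes the result identical everywhere.
const posts = [
  { title: "b", year: 1999, id: 1 },
  { title: "a", year: 1999, id: 2 },
  { title: "c", year: 1996, id: 3 },
];

const sorted = [...posts].sort(
  (x, y) => x.year - y.year || x.id - y.id // tie-break on a unique id
);

console.log(sorted.map((p) => p.id)); // [3, 1, 2] in every engine
```

ES2019 now mandates a stable sort, but the tie-breaker habit still pays off whenever older engines are in your support matrix.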
The web is at risk
This is one of the reasons why I’m writing this post. The other reasons are mostly laid out in Why I have a blog.
URLs are being removed, hidden, truncated and devalued.
Chrome seems to be the most guilty party here. The fact that Chrome intentionally hides the URL and converts the location bar to a search box on mobile is 110% guaranteed to work in favour of Google and its giant search engine crushing all the things behaviour.
An extension of this is content being hosted on a domain that doesn’t indicate the publisher, despite being branded and designed to look like the publisher’s real website. I am of course talking about AMP here. While I like the idea of making the web lighter again (to add speed, add lightness, thanks Lotus), the AMP Project seems to be geared around distracting attention away from some of the other, more basic, principles of the web. Sure, having a set of go-fast principles is great, but I can’t help but be cynical as to why AMP needs to be pushed so hard by Google (I’ve been in meetings which I would describe as corporate bullying), far beyond the optimisation of the user experience.
Let’s talk about search a bit more, too.
Amazing search engine experiences are diminishing the availability of unique and long tail content.
Search indexes are crafted around the most popular content, but not always the best content for a particular search. Don’t get me wrong, I’m a big fan of using data to drive better results (one of my experiences is in delivering a real-time content recommendation engine for the ABC).
No guesses at who this comment is aimed at. I do like the progress we’ve made since the AltaVista, Excite, Yahoo, etc. days, but it’s not without issue. There seems to be a constant loss of precision when it comes to matching certain search terms. Yes, I can add pluses and quotes around things, but often the results don’t mention what I’m looking for and aren’t even close to being relevant.
I rely on being in the major search engines for pretty much all of my traffic. So I’m beating up on the big guy who actually sends traffic my way. But I think the big guy can do better for humanity.
Speaking of which, playing the SEO game is fierce and unfortunately, the sponsored junk blogs are in the lead most of the time.
I try to do my best with adding in all the recommended practices, and I think the sites I’ve produced are OK for what they need to be, but I do know there’s always a cat and mouse game between people trying to game the system and the large search engine giants trying to ungame it. This report, for example, isn’t a bad free guide, and I’ve doubled the monthly visitors on Hawkesbury People and Places simply by following a few basic tips.
What I’m most concerned about, however, is that more and more, content is being hidden inside walled platforms called “social networks”. There is a massive incentive for these companies to retain audiences within their walled gardens. Really, they are just parallel universes vying for the same metric as the rest of the web - eyeballs (and I guess ear canals in the podcast and streaming worlds). This will be the end of the “web” if content isn’t more accessible via search and a basic web browser.
More things I like
I like the ’90s web. And I’m not talking about the design (although it’s kinda nice - I’ve had two comments this week about how ’90s my site looks - I took that as a compliment, and I kinda like how modern the ‘serverless’ and CDN-based backend is despite looking a bit ‘retro’). Here’s my button to advertise my fake company from 2000, inspired by the “Netscape Now” buttons of the time:
It was the golden age of wonderment and “underconstruction” GIFs.
I’m talking about the time when every niche interest was carefully published on free hosting websites, adorned with animated “under construction” signs and really bad tiled backgrounds. It was a fun time.
I remember attempting to host a site on my (temporary) dialup connection, going into school to open it on the computers there, and being amazed that it worked and that my 33.6k connection stayed up! A claim to fame was a month-long phone call with my ISP, which caused issues with their “number of hours used this month” accounting, since the session both started and finished outside the billing month in question.
As you can see, many others have also written on this topic, even with the same article title. I’m just a quiet voice over here in my corner of the internet, and I’m hoping to continue to be able to publish content in this way for as long as people will read it.