
Breaking the web in 2014 - 2016

More site owners are offending the web with their effed-up, bloated, JavaScript-heavy constructions.

These ill-conceived content sites do not provide users with a simple and enjoyable reading experience.

Site developers misuse modern browser technologies in order to show off their alleged technical prowess to their like-minded, dorky web offenders.

The other problem, especially for professional content media sites, is the number of trackers and other gobbledygook that gets downloaded to a user's browser, which severely slows down page load time.

Many so-called new-and-improved web designs bog down older desktop/laptop computers. The JavaScript, the trackers, etc. devour the older CPUs. It's supposed to be a web page with content, not a video game.

We might be better off if we designed sites with minimal HTML and minimal responsive design, creating what might be called responsible design.

A few suggestions:

  • Try designing the site without javascript, or test the site with javascript disabled, and ensure that the site still works with progressive enhancement.
  • Limit the use of giant images.
  • Make every image count. Don't use an image just because it looks cool when it has nothing to do with the rest of the content.
  • Don't break the back button.
  • Don't break the Ctrl-C and Ctrl-V for copy and paste for desktop/laptop users.
  • Don't create confusing and unfamiliar click/touch actions on links and navigation.
  • If you can't control yourself, then create a frigging native app instead of blanking up the web.
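The first suggestion above, progressive enhancement, can be sketched with plain markup. This is an illustrative example, not from any real site: a regular link that works with JavaScript disabled, which a script (if one runs) upgrades to load content in place.

```html
<!-- A plain link: works fine with JavaScript disabled. -->
<a id="older" href="/articles?page=2">Older articles</a>
<div id="articles"></div>

<script>
  // If scripts do run, enhance the link to load the next page in place.
  // The URL and element IDs here are illustrative assumptions.
  document.getElementById('older').addEventListener('click', function (e) {
    e.preventDefault();
    var xhr = new XMLHttpRequest();
    xhr.open('GET', this.getAttribute('href'));
    xhr.onload = function () {
      // Append the fetched articles to the current page.
      document.getElementById('articles')
        .insertAdjacentHTML('beforeend', xhr.responseText);
    };
    xhr.send();
  });
</script>
```

Either way, the link's destination still exists as a normal page, so the back button, middle-click, and script-blocking users all keep working.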

This might make web development boring, but the focus should be on the content, not nifty animations.

Maybe this is how content providers can charge for their content: offer the plain, simple, usable version of the site for a price, while the free users get deluged with the bloated, ad-heavy, so-called sophisticated, modern version.

I understand that geeks like to incorporate their new skills into new and existing projects, but the hip tech-of-the-day should still provide value to the end users.

Too much of this new stuff seems like a solution in need of a problem. Geeks cannot find enough problems to solve with the new tech, so they create new problems by unnecessarily gumming up the works on web sites that did not have problems.

The web experience is becoming increasingly frustrating. Sites that once worked fine by my standards are now becoming so annoying that I may stop reading them, or I'll read their content only if it's provided in an RSS feed that I can view within my site here.

On my older laptop, I use Firefox with the NoScript plug-in, so that by default, I view every website with JavaScript disabled. This speeds up page load time dramatically.

Through the NoScript plug-in, I can enable some or all scripts for the page or the entire site either temporarily or indefinitely. The control is more with me.

For sites that fail to work without JavaScript, I will either enable JS if I like the site, as with Medium.com, or I simply move on, since it's the World Wide Web.

On mobile, however, I primarily use the Safari browser on my iPhone with JS enabled. By mobile, I mean my phone, since I no longer use a tablet. I read for long periods on my iPhone.

Well-designed sites are typically responsively-designed, so they function fine on my smartphone, although I wish designers would trend toward a larger font size for the phone.

In recent years, the font size and line-height have increased for the desktop/laptop versions of websites. But in my opinion, some responsively-designed sites use a font size that is too small for the phone.

When viewing some websites on my laptop, I resize my browser to get the "mobile" version of the responsively-designed site because it functions and looks better than the full-size version.

On my phone, I have little patience for web sites that are not responsively-designed. It's nearly 2015, and thankfully, I'm encountering non-responsive sites less often.

I have no idea why news.ycombinator.com refuses to make the slight changes to enable the site to display well on a smartphone. Among sites that I visit regularly, Hacker News is about the only non-responsively-designed site that I tolerate on the phone.

Jan 2015 update



Bloated CNN design. It took so long to load on my iPhone 5 that I gave up.

Feb 2015 Update




Mar 2015

Pinterest's website is a web abuser even for a browsing-only user who is simply viewing the site.

Twitter's website is a web abuser on the desktop/laptop and on the phone. Unacceptable back-button usage. Can't open a page in the background on the phone. Can't copy text with JavaScript disabled. Infuriating.

Hilariously stupid that such huge properties fail at basic web design that existed more than 20 years ago. No matter. I don't need these properties. It's a "world wide" web.

This web page about tablet weaving is superior in design compared to what Twitter and Pinterest create. That's because this tablet weaving page is more aligned with the true spirit of the web.





HN comments:

> "If it doesn't load through curl, it's broken." --someone

So, so true. Thanks, curl.


That's pretty much my own test for a Web based API - if I can drive it from the command line using curl then great, if I can't then it's broken.


I wasn't saying not to do the fancy stuff but rather to start with something which degrades well and then have your JavaScript enhance that basic experience. If you want to know why this is a good idea, you should start using something like getsentry.com or errorception.com to record your JavaScript errors.


I've been using websites since the early 90s and this pro-single-page sentiment is getting really tiresome. You are breaking the web. You are destroying users' security. Sure, there are plenty of reasons to use JavaScript, and plenty of places where it's appropriate. It probably is a good idea for games and so forth. But requiring users to load and execute constantly-changing code from across the web in order to read a page or submit a form is in-friggin-sane.

Someone else pointed out that it'd be nice if browsers offered more support for things that certain types of developers clearly want to do. I completely agree; it'd definitely be nice to take advantage of many of the technologies which currently exist to do more, in a more structured way. But requiring code execution in order to read data is madness.



Apr 2015

I tried to read a story at usatoday.com, using the Firefox browser with the NoScript plug-in. Even with everything temporarily enabled for the page/site, the site functions horribly. It's an appalling UI/UX. Wow. People get paid to produce web-abusive sites. A 1995-designed site would also be superior to this usatoday.com train wreck. Amazing.

HN Thread : Please stop making infinite scrolling websites

I'm not a fan of the infinite scroll. Depending upon its implementation, it provides a clunky and confusing user experience, and the back button can be broken because clicking away and then going back may place the user at the top of the site. That's annoying after scrolling down several "pages."

Infinite scrolling makes more sense for some services, but I think that most of the time, the simple "Older" and "Newer" links work fine.

Apr 28, 2015

More web abuse.


Random question: How do I stop videos from auto starting on Bloomberg? I'm running Safari with no Flash, and have Ad-block on. Video doesn't start, but audio does. Super Annoying.


It's another web trend I don't understand. All news websites seem to do it, it's super obnoxious, and I don't know anybody who doesn't just rush to click stop as soon as they click the page.


What works for me in Chrome is to disable plugins by default. It seems to work universally, including for Bloomberg.

It's amazing and tremendously annoying how many abusive websites launch a new browser tab when I click a link. Frigging morons. If I wanted to launch the page under the link in a new tab, then I would right click on the laptop or open in the background on the phone.

It's equally annoying and maybe worse when websites DISABLE the right click or open in the background option. These sites with their bloated, silly-ass JavaScript implementation are Grade-A web abusers.

http://qz.com - stunningly abusive when JavaScript is disabled. For their notes, not their articles. Clicks don't work at all in any fashion. On the laptop, I get the hand icon for a clickable link, but nothing works. Wow. This site should be added to some kind of watch list. Even with JS enabled, their notes work abnormally when clicked. I don't see the need for the fanciness. Article links work normally.

May 12, 2015

Another wretched, web-abusive pile of steaming crap:

On the phone, can't open an article page in the background with Safari.

After reading an article and hitting the back button, the site places me back at the top of the site, which is infuriating after I had scrolled down a long way to read the article.

Unbelievable how people pay for this kind of development. Do they test it? How is this acceptable?

Revolting. Sites like this are harmful to the web.

May 26, 2015


July 2015








https://stratechery.com/2015/why-web-pages-suck/ - https://news.ycombinator.com/item?id=9891927

I think there's too much blame being placed on programmatic advertising. That's no excuse for 14MB pages, fixed position ads, trackers pinging the network for a full minute, etc.


John Gruber had strong words about Apple news site iMore:
I love iMore. I think they’re the best staff covering Apple today, and their content is great. But count me in with Nick Heer — their website is shit-ass. Rene Ritchie’s response acknowledges the problem, but a web page like that — Rene’s 537-word all-text response — should not weigh 14 MB.

It’s not just the download size, long initial page load time, and the ads that cover valuable screen real estate as fixed elements. The fact that these JavaScript trackers hit the network for a full minute after the page has completely loaded is downright criminal. Advertising should have minimal effect on page load times and device battery life. Advertising should be respectful of the user’s time, attention, and battery life. The industry has gluttonously gone the other way. iMore is not the exception — they’re the norm. 10+ MB page sizes, minute-long network access, third-party networks tracking you across unrelated websites — those things are all par for the course today, even when serving pages to mobile devices. Even on a site like iMore, staffed by good people who truly have deep respect for their readers.


Ghostery, NoScript, or JavaScript disabled for select domains or for all domains helps speed up the web.

https://news.ycombinator.com/item?id=9897306 -- http://developer.telerik.com/featured/the-webs-cruft-problem/





This is hilarious and extremely ironic: http://www.theverge.com/2015/7/20/9002721/the-mobile-web-sucks

I hate browsing the web on my phone. I do it all the time, of course — we all do. But man, the web browsers on phones are terrible. They are an abomination of bad user experience, poor performance, and overall disdain for the open web that kicked off the modern tech revolution.

I disagree. The problem is not with the mobile web browsers. The problem is with the WEB SITES.

Web sites are an "abomination of bad user experience, poor performance, and overall disdain for the open web."

And theverge.com is one example of a web-abusive site. Its home page is horribly slow-loading thanks to way too many useless images and probably JavaScript. With JavaScript disabled, the site's home page loads significantly faster.

These bloated websites require users to have brand new computers with the latest, fastest CPUs.

Mobile Safari on my iPhone 6 Plus is a slow, buggy, crashy affair, starved for the phone's paltry 1GB of memory and unable to rotate from portrait to landscape without suffering an emotional crisis.

I've never had remotely close to those problems in the 12 months that I've been using my iPhone 5C. I'm still using iOS 7.

Chrome on my various Android devices feels entirely outclassed at times, a country mouse lost in the big city, waiting to be mugged by the first remnant ad with a redirect loop and something to prove.

Um, okay. This person is not a writer.

And I've not had any issues with the Chrome browser on my iPhone. I like that the browser has defaulted to the fast, smooth scroll when viewing websites. Maybe this will be the default for all mobile browsers someday. Then we'll have no need to design a website with the special CSS to make the fast, smooth scroll occur. That CSS munges up other functionality within a mobile browser, like having the top and bottom sections of the browser disappear or shrink when scrolling.
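The "special CSS" in question is presumably WebKit's non-standard momentum-scrolling property; a minimal sketch, assuming that's the property meant here:

```html
<style>
  /* Non-standard and WebKit-only: gives an overflow container the
     fast, momentum-based scrolling that the page itself gets in
     iOS Safari. Applying it to custom scroll containers is what can
     interfere with the browser chrome's shrink-on-scroll behavior. */
  .scroll-pane {
    overflow-y: auto;
    -webkit-overflow-scrolling: touch;
  }
</style>
```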

Granted, this is only theverge.com, and maybe that's why this article lacks intelligent thinking.

The overall state of the mobile web is so bad that tech companies have convinced media companies to publish on alternative platforms designed for better performance on phones.

It's not because of poor mobile browsers and poor phone hardware. It's because of horribly designed websites by media orgs.

So typical. A media company blames someone else.

Way down in that lengthy article, the writer finally states something intelligent.

Now, I happen to work at a media company, and I happen to run a website that can be bloated and slow. Some of this is our fault: The Verge is ultra-complicated, we have huge images, and we serve ads from our own direct sales and a variety of programmatic networks. Our video player is annoying.

We could do a lot of things to make our site load faster, and we're doing them.

Finally, admitting, in a round-about, back-handed way, that it's the media company's fault. And I would say it's 100 percent the media company's fault.

Yet ...

But we can't fix the performance of Mobile Safari.

The writer or theverge.com should design that article page with bare-minimum html, 1995-style, and then load it as a static page and test the load speed on mobile Safari.

Add a viewport meta tag to make the page read better on the phone. And then add a tiny CSS file with a little formatting and maybe a font-family load and a media query. But keep it focused on something useful.

And test that page load time.

Oh, no JavaScript. Don't need it for a user who is only reading the page.
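A minimal sketch of the kind of page described above: bare-bones HTML, a viewport meta tag, and a tiny stylesheet with one media query. The specific fonts and sizes are illustrative, not anyone's actual stylesheet.

```html
<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <!-- The viewport meta tag is what makes text readable on phones. -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <title>Article headline</title>
  <style>
    /* A readable measure, comfortable type, and one media query. */
    body { max-width: 40em; margin: 0 auto; padding: 1em;
           font-family: Georgia, serif; line-height: 1.6; }
    @media (max-width: 480px) { body { font-size: 1.1em; } }
  </style>
</head>
<body>
  <h1>Article headline</h1>
  <p>Article text. No JavaScript required to read this.</p>
</body>
</html>
```

Serve that as a static page and the load time comparison makes the point by itself.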


Jul 21, 2015 tweet

The Verge article blaming browsers for a shitty mobile web is 6MB and has more than 1,000 javascript errors.

Yes, that verge.com article is one of the dumbest things that I have read in 2015.

In my opinion, 100 percent of the blame goes to website developers for creating messes. It's not the fault of the mobile web nor mobile web browsers.

This page loads fine and reads okay on the mobile web. It would read better if a viewport meta tag were used to make the text display larger on a mobile browser. But as it is, it's still very readable when holding the phone in portrait mode, and the text is larger when the phone is held in landscape mode.

The default font size for the above site when holding the phone in portrait mode is about the same size as many NEW responsively-designed websites that for some idiotic reason use a tiny font size.

It's amazing that RWD websites suck on the mobile web. Not everyone has perfect vision. Not everyone wants to read a ton of tiny text on each line. We don't mind vertical scrolling. Make the font size bigger.

I don't know why many website developers/designers use a smaller font size in their media queries as the screen size gets smaller.

In most cases, I use the same font size for a 320 pixel wide screen as I do for a screen that's larger than 1024 pixels. And sometimes, I increase the font size as the screen gets smaller.
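Roughly what that approach looks like in CSS; the pixel values here are illustrative, not taken from any particular stylesheet:

```html
<style>
  /* The same comfortable size at every screen width... */
  body { font-size: 18px; line-height: 1.6; }

  /* ...and, if anything, slightly LARGER on narrow screens,
     never smaller. */
  @media (max-width: 480px) {
    body { font-size: 20px; }
  }
</style>
```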

For http://babyutoledo.com/ I use a larger font size than I normally use for both desktop and phone. Sometimes, I think the font size is too big, but on the phone, I like it. Good line-height, good spacing between paragraphs. It's a comfortable reading experience on my iPhone 5c when holding the phone in portrait mode.

It seems that for many web developers, "user comfort" is unimportant.
















Sep 2015


As usual, most laypeople (and even designers) aren’t very specific in evaluating visual design. If I were to give you quick bullet-points from my free design course, I’d say:
  • Just use one (quality, appropriate) font
  • Use just a few font sizes
  • Use one dominant color, while keeping the rest black, white, and gray
  • Spend more thought on your white space than anything else

Oct 2015

Gee, what a shock that bloated, slow-loading websites load faster once they get trimmed. It's not only the annoying ads. Simply disabling JavaScript reduces some of the site's intended function, but it increases the site's speed.

"Tests of top 50 news sites with three ad-blockers on iPhone show significant decrease in load times for many sites, modest increase in battery life"



What's sad and somewhat bizarre is that people are surprised at the page load speed of a single article page when ads and JavaScript are disabled.

Better late than never in discovering their sites' UX problem.



And I didn't write this HN comment, posted on Oct 15, 2015:

If you have an engineering mind and care about such things - you care about complexity. Even if you don't - user experience matters to everyone.

Have you ever seen something completely insane while everyone around doesn't seem to recognize how awful it really is? That is the web of today. 60-80 requests? 1MB+ single pages?

Your functionality, I don't care if it's Facebook - does not need that much. It is not necessary. When broadband came on the scene, everyone started to ignore it, just like GBs of memory made people forget about conservation.

The fact that there isn't a daily drumbeat about how bloated, how needlessly complex, how ridiculous most of the world's web applications of today really are - baffles me.

I disagree with this HN comment:

The real problem of web development is JavaScript. It’s a relic of the past that hasn’t caught up with times and we end up with half-assed hacks that don’t address the real problem. We need something faster and way more elegant.

I'm writing this from within the JavaScript editor that I borrowed in the summer of 2013 and then hacked to meet my requirements. I have installed versions of this editor in my Junco (powering this site), Grebe, Scaup, and Veery web publishing apps. I can use it easily on my phone. I write fast with it. It works for me and my web writing purposes. For this, I LOVE JavaScript.

JavaScript is not the real problem with web development. Its overuse by designers and developers is the problem. It's used when it's not really necessary, in my opinion, and that bogs down page load times.

http://toledowinter.com uses no JavaScript for the browsing user. When I log in, and when I want to edit with what I call the "enhanced" editor, then I get JavaScript.

JavaScript can be perfectly fine for the logged-in user's tools and dashboard.

Not enough words exist to explain how badly this site is designed.



Designer News comments:

Main reason it is faster than most news sites, beyond all this excellent work: they're not beholden to 3rd-party advertising tech.

another comment:

The async javascript loading is something I still haven't tried yet, but I'm sure would really help the sites we're building.

From the article:

Already one of the fastest websites in the news industry, NPR.org now loads twice as fast as it did previously, furthering public radio's commitment to mobile audiences.

2. Load as much JavaScript asynchronously as possible. Most CSS, and all synchronously-loaded JavaScript, needs to be loaded and interpreted by the browser before a website finishes loading. JavaScript assets that don't affect the initial rendering of your website can be loaded asynchronously by using the async attribute or a JavaScript file loader like RequireJS.

3. Optimize image assets. When developing a responsive website, it's important to be mindful of how much of your users' bandwidth your site will require to load, and it's not unusual for images to be among the heaviest assets on your site. [what? get outta here.]

5. Measure constantly. There are lots of tools available to developers to help identify areas ripe for performance improvement, including PageSpeed and YSlow, and they're tremendously useful.

7. Take testing seriously. Developers wrote unit tests as they worked, and the full NPR.org team began 15-minute, at-your-desk testing sessions a month before launch. In the final two weeks, the team gathered for highly structured, extended sessions. We held separate sessions for mobile and desktop testing.
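The async loading described in item 2 boils down to a single attribute on the script tag. The file names here are illustrative:

```html
<!-- Synchronous: parsing stops until this downloads and executes. -->
<script src="/js/critical.js"></script>

<!-- async: downloads in parallel, runs as soon as it's ready. -->
<script async src="/js/analytics.js"></script>

<!-- defer: downloads in parallel, runs after parsing, in document order. -->
<script defer src="/js/enhancements.js"></script>
```

Only scripts that affect the initial rendering need the synchronous form; everything else can get out of the way of the page load.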








"Really all I’m saying is don’t build a SPA. A SPA will lock you into a framework that has the shelf life of a hamster dump. When you think you need a SPA, just stop thinking."


The world wants Single Page Apps (SPAs), meaning we have to move huge amounts of logic from the server to the browser. We’ve been doing this for years, but in 2015, we’ve found better ways to build these large sprawling front end apps.

Eewww. Maybe the world wants native apps. Why not simply build native apps?

Are these SPAs used for internal web apps at companies to perform tasks by logged-in users? If so, then okey-dokey.

"... we’ve found better ways to build these large sprawling front end apps."

Great. Saddle users' devices with large, sprawling front-end apps. If these piles of steaming poop are used to display text-based content to non-logged-in users, then why?

If the user-experience is improved, then the SPA is a success.

If the user-experience is diminished by a bloated, sluggish, clunky web site, then the SPA is a massive failure. Go back to 1995.


https://www.baldurbjarnason.com/notes/media-websites-vs-facebook/ - Fantastic post - I stumbled upon this on Jan 22, 2016.

“The web doesn’t suck. Your websites suck. All of your websites suck. You destroy basic usability by hijacking the scrollbar. You take native functionality (scrolling, selection, links, loading) that is fast and efficient and you rewrite it with ‘cutting edge’ javascript toolkits and frameworks so that it is slow and buggy and broken. You balloon your websites with megabytes of cruft. You ignore best practices. You take something that works and is complementary to your business and turn it into a liability.”

—Facebook and the media: united, they attack the web


Apparently, the web is broken and slow despite the fact that the apps are using the same infrastructure and standards as the web. Guess how those Instant Articles are formatted? HTML. Guess how those articles get to the app? HTTP.
Those two techie terms should sound familiar to you.

Even the web’s old guard is worried. The web can’t compete. The web can’t compete. The web can’t compete. End times.



baldurbjarnason.com references the above links and then writes:

There’s just one problem with this. It’s completely untrue. Here’s an absolute fact that all of these reporters, columnists, and media pundits need to get into their heads:

The web doesn’t suck. Your websites suck.

And continues with the comment above.

The web is slow???? No. Wrong. Websites seem slow because publishers force users to download megabytes of useless crap. HTTP is not slow. Simple pages load almost instantly.

More ...

The lousy performance of your websites becomes a defensive moat around Facebook.
Of course, Facebook might still win even if you all had awesome websites, but you can’t even begin to compete with it until you fix the foundation of your business.

If your web developers are telling you that a website delivering hypertext and images can’t be just as fast as a native app (albeit behaving in different ways) then you should fire them.

Peter-Paul Koch, web browser tester extraordinaire, picks up on the phrase that I highlighted in the John Gruber quote and runs with it.

The web definitely has a speed problem due to over-design and the junkyard of tools people feel they have to include on every single web page. However, I don’t agree that the web has an inherent slowness. The articles for the new Facebook feature will be sent over exactly the same connection as web pages. However, the web versions of the articles have an extra layer of cruft attached to them, and that’s what makes the web slow to load. The speed problem is not inherent to the web; it’s a consequence of what passes for modern web development. Remove the cruft and we can compete again.

Tools don’t solve the web’s problems, they ARE the problem by Peter-Paul Koch (841 words).

The web is capable of impressive performance and offers a dramatically vaster reach and accessibility than anything an app can offer.

This is a long-standing debate. Except it’s only long-standing among web developers. Columnists, managers, pundits, and journalists seem to have no interest in understanding the technical foundation of their livelihoods. Instead they are content with assuming that Facebook can somehow magically render HTML over HTTP faster than anybody else and there is nothing anybody can do to make their crap scroll-jacking websites faster. They buy into the myth that the web is incapable of delivering on its core capabilities: delivering hypertext and images quickly to a diverse and connected readership.

We continue to have this problem because your web developers are treating the web like an app platform when your very business hinges on it being a quick, lightweight media platform with a worldwide reach.

Great piece because it sounds like what I've been saying for a while. It destroys the feeble-thinking expressed in this article http://www.theverge.com/2015/7/20/9002721/the-mobile-web-sucks


The mobile web doesn’t suck. Mobile web browsers don't suck. Your websites suck.

Build websites. Don't try to build native app sites. If you want native app functionality, then build a native app. Quit trying to make websites act like native apps.



“The mobile web is not a technical thing – it is a misconception and one that is hard to match. Companies nowadays start with a native experience. This is where the short-term gain is. This is where the short-term high user numbers lie. This is the beautiful technology – the shiny devices, the backing from OS vendors and the exciting interaction models that come with the OS. It feels good, it feels that there is something to work with. Then these people hear about the benefits of web technologies: cross-platform support, lower cost of engineering as you don’t need to pay experts of red-hot new technologies, re-use across different form factors and so on. A lot of times this happens when the honeymoon period of the native solution is over and there is new growth to be unearthed.”

—– Christian Heilmann



“This is why I would work hard to avoid any “Us vs Them” rhetoric. Much the opposite, I would argue all developers should aim to achieve a combination of engineering and craftsmanship. Engineers should strive for creativity, think out of the box, break rules when needed or better, know when to compromise on purity and keep in touch with the end value at all times. Craftsmen should embrace higher levels of abstraction, and they should aim for maintainability, DRY and learning new patterns and languages, which will only give them more power to express themselves and to create efficiently.”

—Sébastien Cevey

http://bradfrost.com/blog/post/this-is-an-updated-website/ - Posted on 11.06.12 -

Oh, and Javascript on the site? There is none. That will probably change, but I think that’s pretty crazy.

Feb 2016


found this older post:


March 2016


One of the best posts that I have seen. At least a few of us think this way.

"Do web developers actually use web browsers?"

And thankfully, another person recognizes the wretchedness in Twitter's web design. It has to be the worst web design for a company with such a high Wall Street value. But most Twitter users probably access the service via a desktop/laptop app or a mobile app. Maybe the web version of Twitter is designed to be incredibly clunky to encourage more people to download an app.

About Twitter:

See, here is a screenshot of a tweet, with all of the parts that do not work without JavaScript highlighted in red.

That × button at the top right, and all the empty surrounding space? All they do is take you to my profile, which is shown in a skeletal form behind the tweet. They could just as well be regular links, like the “previous” and “next” links on the sides. But they’re not, so they don’t work without JavaScript.

That little graph button, for analytics? All it does is load another page in a faux popup with an iframe. It could just as well be a regular link that gets turned into a popup by script. But it’s not, so it doesn’t work without JavaScript.

The text box? Surely, that’s just a text box. But if you click in it before the JavaScript runs, the box is still awkwardly populated with “Reply to @eevee”. And when the script does run, it erases anything you’ve typed and replaces it with “Reply to @eevee” again, except now the “@eevee” is blue instead of gray.

That happens on Twitter’s search page, too, which is extra weird because there’s no text in the search box! If you start typing before scripts have finished running, they’ll just erase whatever you typed. Not even to replace it with homegrown placeholder text or apply custom styling. For no apparent reason at all.

About other sites:

I also use NoScript, so I’ve seen some other bizarre decisions leak through on sites I’ve visited for the first time. Blank white pages are common, of course. For quite a while, articles on Time’s site loaded perfectly fine without script, except that they wouldn’t scroll — the entire page had overflow: hidden; that was removed by script for reasons I can’t begin to fathom. Vox articles also load fine, except that every image is preceded by an entire screen height’s worth of empty space. Some particularly bad enterprise sites are a mess of overlapping blocks of text; I guess they gave up on CSS and implemented their layout in JavaScript.

There’s no good reason for any of this. These aren’t cutting-edge interactive applications; they’re pages with text on them. We used to print those on paper, but as soon as we made the leap to computers, it became impossible to put words on a screen without executing several megabytes of custom junk?

I can almost hear the Hacker News comments now, about what a luddite I am for not thinking five paragraphs of static text need to be infested with a thousand lines of script. Well, let me say proactively: fuck all y’all. I think the Web is great, I think interactive dynamic stuff is great, and I think the progress we’ve made in the last decade is great. I also think it’s great that the Web is and always has been inherently customizable by users, and that I can use an extension that lets me decide ahead of time what an arbitrary site can run on my computer.

Yes, when I log into a web service, I expect to encounter an elegantly-designed, dynamic web interface that has been designed and developed by extremely talented people.

In my opinion, that happens when I log into my Digital Ocean account. The JavaScript helps make the experience smooth and easy. The JavaScript is not used to show-off. The JavaScript seems to be a background experience. The experience is smooth and maybe unnoticeable, which is even better. I log in, perform a task or two, and then exit. I'm not looking to be wowed by fancy tech.

The JavaScript should act like an offensive lineman that's doing a great job of run-blocking and protecting the quarterback, and the lineman goes unnoticed by fans and media.

When JavaScript is misused in a show-offy fashion, it becomes an obvious, annoying foreground experience. It becomes the offensive lineman who gets noticed for committing penalties and for failing to block the rusher who crushes the QB.

More from eev.ee:

I’m not saying that genuine web apps like Google Maps shouldn’t exist — although even Google Maps had a script-free fallback for many years, until the current WebGL version! I’m saying that something has gone very wrong when basic features that already work in plain HTML suddenly no longer work without JavaScript. 40MB of JavaScript, in fact, according to about:memory — that’s live data, not download size. That might not sound like a lot (for a page dedicated to showing a 140-character message?), but it’s not uncommon for me to accumulate a dozen open Twitter tabs, and now I have half a gig dedicated solely to, at worst, 6KB of text.

Reinventing the square wheel

You really have to go out of your way to do this. I mean, if you want a link, you just do <a href="target">label</a> and you are done.
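eev.ee's point can be made concrete. A plain anchor works everywhere by default; the hypothetical markup below (not from any cited site) shows the reimplemented version that needs script just to function, and that breaks middle-click, copy-link-address, and the back button for free:

```html
<!-- Plain HTML: a working link with zero script. -->
<a href="/articles/42">Read the article</a>

<!-- The "reinvented square wheel": needs JavaScript to work at all,
     and tells the browser nothing about being a link. -->
<span class="link" onclick="window.location = '/articles/42'">Read the article</span>
```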

This is what people mean when they harp on about “semantics” — that there’s useful information to be gleaned.

If I may offer some advice

Accept that sometimes, or for some people, your JavaScript will not work. Put some thought into what that means. Err on the side of basing your work on existing HTML mechanisms whenever you can. Maybe one day a year, get your whole dev team to disable JavaScript and try using your site. Commence weeping.

If you’re going to override or reimplement something that already exists, do some research on what the existing thing does first. You cannot possibly craft a good replacement without understanding the original. Ask around. Hell, just try pressing / before deciding to make it a shortcut.

Remember that for all the power the web affords you, the control is still ultimately in the end user’s hands. The web is not a video game console; act accordingly. Keep your stuff modular. Design proactively around likely or common customizations. Maybe scale it down a bit once you hit 40MB of loaded script per page.
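The advice about basing your work on existing HTML mechanisms is just progressive enhancement. A hedged sketch, with a hypothetical /search endpoint: the form works with JavaScript disabled, because the browser submits it as an ordinary GET request; the script, when it runs, only smooths the experience.

```html
<!-- Works with JavaScript disabled: the browser submits a normal GET. -->
<form action="/search" method="get">
  <input type="search" name="q">
  <button type="submit">Search</button>
</form>
<div id="results"></div>

<script>
  // Optional enhancement: intercept the submit and load results in
  // place. If this script never runs, the form still works.
  document.querySelector('form').addEventListener('submit', function (e) {
    e.preventDefault();
    fetch('/search?q=' + encodeURIComponent(this.q.value))
      .then(function (r) { return r.text(); })
      .then(function (html) {
        document.querySelector('#results').innerHTML = html;
      });
  });
</script>
```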



"Ads on news sites gobble up as much as 79% of users' mobile data"

That creates a slow web browsing experience, but according to the verge.com writer from the summer of 2015, the blame belongs to mobile web browsers.

One of the reasons consumers download mobile ad blockers is the impact ads have on their data plans. A report released Wednesday from Enders Analysis appears to back up that claim — at least when it comes to a sample of news websites.

March 2016


According to the HTTP Archive, the average top 1,000 web page is 2,123 KB, compared to 626 KB in 2010.

Images: 1253 KB v. 372 KB
JS: 425 KB v. 103 KB
CSS: 64 KB v. 24 KB

That's some pitiful bloat. And just think how much worse it will get by 2020.
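For scale, the growth factors implied by those HTTP Archive figures can be worked out directly:

```javascript
// Growth factors from the HTTP Archive figures quoted above
// (average top-1,000 page, 2010 vs. March 2016).
const kb2010 = { total: 626, images: 372, js: 103, css: 24 };
const kb2016 = { total: 2123, images: 1253, js: 425, css: 64 };

const growth = {};
for (const key of Object.keys(kb2010)) {
  // How many times heavier the 2016 figure is than the 2010 one.
  growth[key] = +(kb2016[key] / kb2010[key]).toFixed(2);
}
// Pages more than tripled (~3.4x) in six years, and the JavaScript
// payload more than quadrupled (~4.1x).
```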



Please stop building websites that aren't responsive. Please stop assuming that I want to use your site on a desktop. Please stop serving up a mobile view that doesn't have the content I want and forcing me to try and read your small text on a 'desktop version'.

Please stop breaking the internet.



April 2016

https://brave.com/blogpost_4.html - "Brave's Response to the NAA: A Better Deal for Publishers"

The news industry is in catastrophic decline and has been for years.

Abusive, excessive, and even dangerous online advertising is driving users to adopt ad blockers en masse, and this is a trend that will not be reversed by legal threats, server-side anti-blocker countermeasures, or harsh language.

We note that malware has been distributed on the websites of the New York Times and the BBC recently through the ill-designed, unregulated, and poorly-delegated third-party advertising technology ecosystem.

In sum, and contrary to the misstatements of the NAA letter, Brave is the solution, not the problem, for users and publishers. We provide speed, privacy, protection from malware, and a new, anonymous payment model that helps the whole industry and publishers in particular, compared to the status quo.

The privacy point is overlooked in the NAA’s attack on Brave and worth emphasizing. The violation of individual privacy has reached epidemic proportions. The news industry has been an active participant in violating individual readers’ privacy by benefitting from non-consensual third party tracking and ads.

News industry leaders rightly decry the violation of privacy inherent in some NSA or FBI tactics, yet their own complicity in tracking individuals to even more invasive degrees is not addressed.

Furthermore, the NAA's letter misconstrues how Web standards and browsers work by design: the Web is a system that allows users to consume content in any combination and presentation that user-chosen software can achieve. Browsers do not "republish", copy, serve, syndicate, or distribute content across the Internet or to any computer other than the one on which they run.

Browsers do not just play back recorded pixels from the publishers’ sites. Browsers are rather the end-user agent that mediates and combines all the pieces of content, including third-party ads and first-party publisher news stories. Web content is published as HTML markup documents with the express intent of not specifying how that content is actually presented to the browser user. Browsers are free to ignore, rearrange, mash-up and otherwise make use of any content from any source.


Today, 17 member companies of the Newspaper Association of America sent a letter to Brave Software, Inc. notifying the company that its well-publicized plan to replace publishers' ads on the publishers' own websites and mobile applications with Brave's own advertising is blatantly illegal. The signatories of the letter represent more than 1,200 newspapers in the United States.

This might show how dumb media publishers are, and it might explain why the newspaper industry has failed in the consumer internet age.






IMO, the cited examples are minimalist but not brutalist.

Agreed. I think more along the lines of http://thin.npr.org/
.. which is great & I wish more sites had a "thin" version.

That is beautiful. As someone who tries to browse without JavaScript as much as possible, and so frequently sees screwed-up layout, I instinctively scrolled down to see the 'real' content before realising—no, there it is!

This is great! I had no idea this even existed, but I agree, it would be great if many more sites offered something like this. Especially when on really slow connections, this would come in very handy.

Yep, thin.npr.org is a great web design. Okay, it lacks the viewport meta tag that would make it display nicer on a phone, but at least a reader can read the site in landscape mode and zoom in to enlarge the text. The browser's back button works properly with the site. Links are underlined. The thin site supports the open web better than most sites. It uses minimal HTML. The pages are lightweight and fast-loading. A smidge of inline JavaScript exists: four lines, two of which are curly braces. No external JavaScript is loaded. No inline or external CSS is used.
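For reference, the one line thin.npr.org omits is the viewport meta tag. Adding it to the document head would make phone browsers render the page at a readable width instead of a zoomed-out desktop width, at no cost to the site's leanness:

```html
<!-- Without this, mobile browsers assume a ~980px desktop layout
     and force the reader to pinch-zoom. -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```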

From: Dulles, VA - Chrome - Cable
4/22/2016, 10:20:01 PM
First View - Fully Loaded:
2 requests
2 KB

That's brilliant.

http://drudgereport.com was mentioned in that HN thread.

Virtually all of the text is underlined because virtually everything on that page is a link.

It dates from a time before it was considered "good design" and "usability" to hide the links from the readers.

But the trend, it seems, is to go back to underlining links, at least within the body of an article. Navigation or menu links located at the top and bottom of the site may not be underlined, but the single-word nouns or verbs are normally obvious links. It's the links that exist within the body of an article that can be hard to notice if the text and link colors don't offer enough contrast.

Keep it simple and underline links within the body of an article.

Another HN commenter:

If everything's a link, then you hardly need special notation to call out the links from the non-links. It's not a conscious effort to make links more identifiable; it just didn't occur to the creator that you can do something with links other than underline them.

Ahh, but sometimes a site contains regular text on its page too, which means the links must be underlined.

Another HN comment:

Just showed some of these to a generally non-tech-savvy friend who said he didn't like them because they looked "too 90s." Personally I love them because they load fast, are easy to read, and don't require a knowledge of a bunch of different frameworks to write.

Another HN comment:

I have been fighting for years to get people used to "90s aesthetics."

It's even more important for web design. Give me simple HTML with a touch of css, and javascript only if it's absolutely necessary. I can think of hardly any websites that I would consider "beautiful" these days for exactly this reason.

PS: I'm not sure I would classify these sites as brutalist; perhaps 'utilitarian' or 'functional' would be better descriptors.

Wow. Right on. I did not post that comment, but it represents my thinking.


HN comment:

You can make something that doesn't require tons of frameworks and loads fast while NOT looking like a relic of the days of Kazaa. The fact that so many developers are too lazy to do so does not mean we should throw the baby out with the bathwater and go back to times new roman black-on-white.

True. Worthwhile CSS and JavaScript are fine.


Back to the brutalist HN thread:

I despise the trend for content minimalism. Where once there was a headline and sub, there's often now a cute little tile with picture and trimmed headline needing 20x the space to show the same number of stories. bbc.co.uk

Yep. I despise that design look and function too. Wasted space. Bloated web page. All those images tinted with the titles over the images. Hideous.

HN comment:

But websites and buildings are something that real people actually use, so fuck off with your -isms. Make websites where content is readable and easy to navigate. Make buildings that are great to live and work in as opposed to those whose mockups look unique and stunning in "Architectural wankery monthly".

HN comment:

Warren Buffett's holding company, Berkshire Hathaway, could go on this list. Berkshire's operating subsidiaries (Geico, Duracell, Heinz, etc) have fancy modern websites but the investment company's looks like it predates Geocities. IIRC this is because they don't want to spend money on an already functional website that isn't really selling anything.


As an overt visual design paradigm, meh. But hallelujah to the idea of a page that just has content, without the trendily de rigueur fucktons of overblown css and pointless javascript that adds 0 and only serves to crash my crappy mobile browser.

Simple sites/pages mentioned in that brutalist HN thread:

http://37signals.com is mentioned at http://brutalistwebsites.com/

At the top of brutalistwebsites.com:

Brutalist Websites

In its ruggedness and lack of concern to look comfortable or easy, Brutalism can be seen as a reaction by a younger generation to the lightness, optimism, and frivolity of today's webdesign.


37signals (Jason Fried)

Q: Why do you have a Brutalist Website?
A: I see it more as a letter, not a web site. It just happens to be on the web. Also, re: "In its ruggedness and lack of concern to look comfortable or easy"... For me it's all about comfort and ease - especially when it comes to reading. I don't consider our site rugged any more than a simple letter is rugged.


If there were no bloated ads, some top websites would load up to 90% faster.

Proper terminology was used: websites would load faster, not the web. Websites are the problem, not the web.



And that doesn’t even speak to the arguably more important issue: page load time. Compared to an AMP page, trying to load a regular page of content on the web feels like trying to suck it in through a straw. A very tiny straw.

If anyone saw a regular page of web content side-by-side with an AMP’d page, there’s no question they’d choose to see the latter, every single time.

Because on the desktop we’re all used to seeing the absolute worst of the web. That is, ridiculous widgets, awful JavaScript load times, and, of course, ads galore. AMP stripped all of the crud away and just gave me unadulterated content. And gave it to me fast.
It was such a revelation. I wanted to view all web-based content this way. Not just on mobile, everywhere.

Welcome to the dark side of wanting faster, simpler websites.

Same WSJ article, two formats, mentioned in the above post:

From: Dulles, VA - Chrome - Cable - 4/28/2016, 11:20:00 AM
First View Fully Loaded:
268 requests
4,642 KB

From: Dulles, VA - Chrome - Cable - 4/28/2016, 11:20:52 AM
27 requests
425 KB

Same article, big diff in speed and reader-friendliness.

Why not create the simpler format by default?

Add /amp to the end of The Guardian URLs.

For WaPo stories, add "amphtml" after the .com/ and before the rest of the story's URL.
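Those two URL tricks can be sketched as a throwaway helper. The patterns are just the ones described above, as of spring 2016; both publishers can change their routing at any time, so treat this as illustrative only:

```javascript
// Hypothetical helper: rewrite a Guardian or WaPo article URL into
// its AMP variant, per the 2016-era patterns described above.
function ampUrl(url) {
  const u = new URL(url);
  if (u.hostname.endsWith('theguardian.com')) {
    // The Guardian: append /amp to the article path.
    u.pathname = u.pathname.replace(/\/$/, '') + '/amp';
  } else if (u.hostname.endsWith('washingtonpost.com')) {
    // The Washington Post: insert "amphtml" right after the domain.
    u.pathname = '/amphtml' + u.pathname;
  }
  return u.toString();
}
```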


"... page weight does matter. Access can be slow, expensive and prohibitive."



"Speed is one of those features people don’t ask for but really appreciate. It makes everything else you build a lot better." CEO of Pinterest, Silbermann http://www.wired.com/2016/04/pinterest-reinvents-prove-really-worth-billions/


May 2016


JavaScript has brought the web to the brink of ruin, but there’s no JavaScript in podcasting. Just an RSS feed and MP3 files.

I understand what he's saying, but I would qualify it more.

The misuse of JavaScript has brought the web to the brink of ruin.

Who gets to define misuse? Does the article page or site function fine without JavaScript? What does a thousand pounds of JavaScript do for a single article page that's accessed by a browsing-only reader?

For a site that requires the user to log in, I expect the dashboard or whatever to employ JavaScript in an elegant manner.

Using JavaScript for useless extravagance is breaking the web.







Top HN comment:

On a more serious note - RSS is the Great Web Leveller. It spites your fancy CSS hacks, it's disgusted by your insane javascript, and it will piss all over your "mobile-optimized" crap. No semantic markup == no party; because markup is for robots, and RSS parsers are very stubborn robots that can see through web-hipster bullshit like Superman through walls.

another HN comment well down in the thread:

Regarding reading news on the Kindle: I've noticed that the browser becomes much more responsive (read: usable) if one disables JavaScript altogether. Sure, some websites will break, but most will work well enough to be able to read articles.

Disabling JavaScript is also my trick for actually being able to browse the internet on a phone these days. In Firefox for Android, you can install a plugin to toggle it on/off for when you need it.

A bit sad that you have to do this to get a decent experience, but what can you do...

That's me, except on desktop/laptop. The latest, greatest CPU is needed to read websites, especially those produced by media orgs.

Pages load incredibly fast when JavaScript is disabled. Any slowness is probably due to either Firefox or the NoScript add-on/plug-in.

Another HN comment:

I develop and maintain an open source RSS reader. In my experience it's not so bad. I strip CSS and javascript from feeds, and most of them are displayed fine anyway. I don't think I've ever found a feed that needed javascript to load content, it seems even SPAs include plain entry content in their feeds, thankfully. I've never found a feed that became unreadable after stripping styling either.

I agree it's interesting to look at your content when loaded in an RSS reader. IMHO most feeds are actually more readable when loaded in a clean uncluttered RSS reader than in the original webpage. If the content is good, the reading experience should not be harmed by focusing just on its text and images and removing extra styling.

Shameless plug: the RSS reader I maintain is https://www.feedbunch.com , comments are welcome.

Then why include CSS and/or JavaScript? Probably because the JavaScript is used for navigation, some kind of scrolling extravagance, image display, ads, trackers, etc.
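The stripping that the feedbunch commenter describes can be sketched roughly like this. The regexes below are only illustrative (feedbunch's actual implementation is not shown here); a real reader should run the markup through a proper HTML sanitizer, since regexes can be fooled by malformed markup:

```javascript
// Naive sketch of stripping <script>/<style> blocks and inline style
// attributes from a feed entry's HTML, leaving the text and images.
function stripActiveContent(html) {
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, '')  // remove script blocks
    .replace(/<style[\s\S]*?<\/style>/gi, '')    // remove style blocks
    .replace(/\sstyle="[^"]*"/gi, '');           // remove inline styles
}
```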

#web - #javascript - #css - #html5 - #responsive - #design - #moronism - #blog_jr

By JR - 8316 words




© 2013-2017 JotHut - Online notebook
