web

Autocomplete Interview - TXT Answer The Web's Most Searched Questions

Beomgyu, Taehyun, Yeonjun, Soobin and Huening Kai of TOMORROW X TOGETHER visit WIRED to answer their most searched-for questions on Google. What does TXT mean? What is the best TXT song? Who is the best dancer in the group? What collective name do TOMORROW X TOGETHER fans go by? What is Yeonjun's ideal type? What is Soobin’s personality like? Is Taehyun a magician? What does Beomgyu smell like? Does Huening Kai have perfect pitch? TXT answer these questions and many, many more on the WIRED Autocomplete Interview.

Gen Z icons TOMORROW X TOGETHER returned with their 7th mini album, The Star Chapter: SANCTUARY. Available on all streaming platforms now!

Director: Justin Wolfson
Editor: J.Y. Chun
Talent: Beomgyu; Choi Soo-bin; Huening Kai; Taehyun; Yeonjun
Line Producer: Joseph Buscemi
Associate Producer: Paul Gulyas; Brandon White
Production Manager: Peter Brunette
Production Coordinator: Rhyan Lark
Talent Booker: Jenna Caldwell; Katie Pearce
Post Production Supervisor: Christian Olguin
Post Production Coordinator: Ian Bryant
Supervising Editor: Doug Larsen
Assistant Editor: Billy Ward




web

What Exactly Is a Web Attack?

What is a Web Attack? Web attacks target vulnerabilities in websites to gain unauthorized access, steal confidential information, inject malicious content, or alter the website’s content. They can also mount a denial-of-service attack against web servers. XSS: Cross-Site Scripting (XSS) is an extremely common and widespread technique that allows […]
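The excerpt breaks off at XSS, but the standard first line of defense is easy to sketch. Here is a minimal TypeScript illustration (not from the original post; the escapeHtml helper and the comments container are hypothetical) of escaping untrusted input before inserting it into a page:

    // Neutralize the characters XSS payloads rely on before user
    // input is interpolated into markup. Order matters: escape & first.
    function escapeHtml(input: string): string {
      return input
        .replace(/&/g, "&amp;")
        .replace(/</g, "&lt;")
        .replace(/>/g, "&gt;")
        .replace(/"/g, "&quot;")
        .replace(/'/g, "&#39;");
    }

    // A hostile comment renders as inert text instead of executing.
    const comment = "<script>alert(1)</script>";
    document.getElementById("comments")!.innerHTML = `<p>${escapeHtml(comment)}</p>`;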




web

The Top 19 Sex Chat Websites & Adult Chatrooms 18+ Free Access

The website immediately advertises its male and female porn stars, its most popular strippers, and categories like male, female, trans, or couples. For one, 321 SexChat has tons of unique rooms for you to join, whether you are into furries, nymphs, saunas, or whatever else. Live Cams/Chat Reviews The main reason that ladies […]




web

Free Sex Dating Sites, Sex Websites & Adult Dating Websites

SilverSingles understands you’ve got to get along with your casual partner. This likely explains the detailed user profiles on this site – they help paint a quick picture of your matches before you actually break the ice. This spot boasts a like-minded community that’s one hundred percent LGBTQ+. And, because the […]




web

Avast vs Webroot – Which Is Better?

Avast has a wide selection of additional features to protect your home network and business from malware. The user-friendly interface makes it easy to use, and signup is quick. The software is also affordable for most people and has a solid reputation in the cybersecurity industry. The sole issue keeping it from being […]




web

The Website Where Women Are Looking for Men & You Can Meet Single Girls

Best “Mail Order Bride” Sites to Look for Love Abroad: When http://aevawedding.com/bosnian-brides you join our online dating community, bored and lonely housewives from your location become visible to you. So, feel free to tell us your preferences and we will then help you meet real matches. […]




web

5 Do’s and Don’ts for Effective Online Meetings

Virtual meetings have many benefits, including cost savings and productivity gains. However, they can also be ineffective and frustrating if the participants are not engaged. Here are a few simple do’s and don’ts to ensure successful online meetings. Online meeting attendees should be provided […]




web

Which Is Better, Webroot or Avast?

Which is better, Webroot or Avast? Both software programs protect against malware and other digital threats. Both offer protection against ransomware and include a password manager. However, Avast is the better choice when it comes to preventing malware and other threats thanks to its faster scans and […]




web

PFRDA Website Overhaul: Regulatory Body Invites IT Firms; Seeks To Improve User Experience

The overarching goal of the PFRDA-Connect project is to significantly enhance the digital presence of PFRDA by overhauling its official website, leading to improved user experience throughout the user journey. 




web

Indian Police Force Review | A No-logic Web Series

Created by Rohit Shetty, with a story and script by Sandeep Saket, Anusha Nandkumar, Ayush Trivedi, Vidhi Ghodgaonkar, and Sanchit Bedre, the web series Indian Police Force is directed by Rohit Shetty and Sushwanth Prakash. Indian Police Force is streaming on the OTT (over-the-top media service) platform Amazon Prime Video. The series […]

The post Indian Police Force Review | A No-logic Web Series appeared first on TIMES OF ASSAM by Dhruba Jyoti Deka.




web

Breaking the web forward

Safari is holding back the web. It is the new IE, after all. In contrast, Chrome is pushing the web forward so hard that it’s starting to break. Meanwhile web developers do nothing except moan and complain. The only thing left to do is to pick our poison.

Safari is the new IE

Recently there was yet another round of “Safari is the new IE” stories. Once Jeremy’s summary and a short discussion cleared my mind, I finally figured out that Safari is not IE, and that Safari’s IE-or-not-IE is not the worst problem the web is facing.

Perry Sun argues that for developers, Safari is crap and outdated, emulating the old IE of fifteen years ago in this respect. He also repeats the theory that Apple is deliberately starving Safari of features in order to protect the app store, and thus its bottom line. We’ll get back to that.

The allegation that Safari is holding back web development by its lack of support for key features is not new, but it’s not true, either. Back fifteen years ago IE held back the web because web developers had to cater to its outdated technology stack. “Best viewed with IE” and all that. But do you ever see a “Best viewed with Safari” notice? No, you don’t. Another browser takes that special place in web developers’ hearts and minds.

Chrome is the new IE, but in reverse

Jorge Arango fears we’re going back to the bad old days with “Best viewed in Chrome.” Chris Krycho reinforces this by pointing out that, even though Chrome is not the standard, it’s treated as such by many web developers.

“Best viewed in Chrome” squares very badly with “Safari is the new IE.” Safari’s sad state does not force web developers to restrict themselves to Safari-supported features, so it does not hold the same position as IE.

So I propose to lay this tired old meme to rest. Safari is not the new IE. If anything it’s the new Netscape 4.

Meanwhile it is Chrome that is the new IE, but in reverse.

Break the web forward

Back in the day, IE was accused of an embrace, extend, and extinguish strategy. After IE6 Microsoft did nothing for ages, assuming it had won the web. Thanks to web developers taking action in their own name for the first (and only) time, IE was updated once more and the web moved forward again.

Google learned from Microsoft’s mistakes and follows a novel embrace, extend, and extinguish strategy by breaking the web and stomping on the bits. Who cares if it breaks as long as we go forward. And to hell with backward compatibility.

Back in 2015 I proposed to stop pushing the web forward, and as expected the Chrome devrels were especially outraged at this idea. It never went anywhere. (Truth to tell: I hadn’t expected it to.)

I still think we should stop pushing the web forward for a while until we figure out where we want to push the web forward to — but as long as Google is in charge that won’t happen. It will only get worse.

On alert

A blog storm broke out over the decision to remove alert(), confirm() and prompt(), first only the cross-origin variants, but eventually all of them. Jeremy and Chris Coyier already summarised the situation, while Rich Harris discusses the uses of the three ancient modals, especially when it comes to learning JavaScript.

With all these articles already written I will only note that, if the three ancient modals are truly as horrendous a security issue as Google says they are, it took everyone a bloody long time to figure that out. I mean, they turn 25 this year.

Although it appears Firefox and Safari are on board with at least the cross-origin part of the proposal, there is no doubt that it’s Google that leads the charge.

From Google’s perspective the ancient modals have one crucial flaw quite apart from their security model: they weren’t invented there. That’s why they have to be replaced by — I don’t know what, but it will likely be a very complicated API.

Complex systems and arrogant priests rule the web

Thus the new embrace, extend, and extinguish is breaking backward compatibility in order to make the web more complicated. Nolan Lawson puts it like this:

we end up with convoluted specs like Service Worker that you need a PhD to understand, and yet we still don't have a working <dialog> element.

In addition, Google can be pretty arrogant and condescending, as Chris Ferdinandi points out.

The condescending “did you actually read it, it’s so clear” refrain is patronizing AF. It’s the equivalent of “just” or “simply” in developer documentation.

I read it. I didn’t understand it. That’s why I asked someone whose literal job is communicating with developers about changes Chrome makes to the platform.

This is not isolated to one developer at Chrome. The entire message thread where this change was surfaced is filled with folks begging Chrome not to move forward with this proposal because it will break all-the-things.

If you write documentation or a technical article and nobody understands it, you’ve done a crappy job. I should know; I’ve been writing this stuff for twenty years.

Extend, embrace, extinguish. And use lots of difficult words.

Patience is a virtue

As a reaction to web dev outcry Google temporarily halted the breaking of the web. That sounds great but really isn’t. It’s just a clever tactical move.

I saw this tactic in action before. Back in early 2016 Google tried to break the de-facto standard for the mobile visual viewport that I worked very hard to establish. I wrote a piece that resonated with web developers, whose complaints made Google abandon the plan — temporarily. They tried again in late 2017, and I again wrote an article, but this time around nobody cared and the changes took effect and backward compatibility was broken.

So the three ancient modals still have about 12 to 18 months to live. Somewhere in late 2022 to early 2023 Google will try again, web developers will be silent, and the modals will be gone.

The pursuit of appiness

But why is Google breaking the web forward at such a pace? And why is Apple holding it back?

Safari is kept dumb to protect the app store and thus revenue. In contrast, the Chrome team is pushing very hard to port every single app functionality to the browser. Ages ago I argued we should give up on this, but of course no one listened.

When performing Valley Kremlinology, it is useful to see Google policies as stemming from a conflict between internal pro-web and anti-web factions. We web developers mainly deal with the pro-web faction, the Chrome devrel and browser teams. On the other hand, the Android team is squarely in the anti-web camp.

When seen in this light the pro-web camp’s insistence on copying everything appy makes excellent sense: if they didn’t Chrome would lag behind apps and the Android anti-web camp would gain too much power. While I prefer the pro-web over the anti-web camp, I would even more prefer the web not to be a pawn in an internal Google power struggle. But it has come to that, no doubt about it.

Solutions?

Is there any good solution? Not really.

Jim Nielsen feels that part of the issue is the lack of representation of web developers in the standardization process. That sounds great but is proven not to work.

Three years ago Fronteers and I attempted to get web developers represented and were met with absolute disinterest. Nobody else cared even one shit, and the initiative sank like a stone.

So a hypothetical web dev representative in W3C is not going to work. Also, the organisational work would involve a lot of unpaid labour, and I, for one, am not willing to do it again. Neither is anyone else. So this is not the solution.

And what about Firefox? Well, what about it? Ten years ago it made a disastrous mistake by ignoring the mobile web for way too long, then it attempted an arrogant and uninformed come-back with Firefox OS that failed, and its history from that point on is one long slide into obscurity. That’s what you get with shitty management.

Pick your poison

So Safari is trying to slow the web down. With Google’s move-fast-break-absofuckinglutely-everything axiom in mind, is Safari’s approach so bad?

Regardless of where you feel the web should be on this spectrum between Google and Apple, there is a fundamental difference between the two.

We have the tools and procedures to manage Safari’s disinterest. They’re essentially the same as the ones we deployed against Microsoft back in the day — though a fundamental difference is that Microsoft was willing to talk while Apple remains its old haughty self, and its “devrels” aren’t actually allowed to do devrelly things such as managing relations with web developers. (Don’t blame them, by the way. If something would ever change they’re going to be our most valuable internal allies — just as the IE team was back in the day.)

On the other hand, we have no process for countering Google’s reverse embrace, extend, and extinguish strategy, since a section of web devs will be enthusiastic about whatever the newest API is. Also, Google devrels talk. And talk. And talk. And provide gigs of data that are hard to make sense of. And refer to their proprietary algorithms that “clearly” show X is in the best interest of the web — and don’t ask questions! And make everything so fucking complicated that we eventually give up and give in.

So pick your poison. Shall we push the web forward until it’s broken, or shall we break it by inaction? What will it be? Privately, my money is on Google. So we should say goodbye to the old web while we still can.




web

What do you need to make a successful web app?

Here are some things you need to make a successful web app:

  • A plan to make an application that helps real people make their lives easier, solving a well-researched problem
  • An understanding of human psychology
  • Knowledge of how to design, both in terms of UX flow and visual design
  • A marketing plan, to tell potential customers that your […]




web

Sustainable Web Design, An Excerpt

In the 1950s, many in the elite running community had begun to believe it wasn’t possible to run a mile in less than four minutes. Runners had been attempting it since the late 19th century and were beginning to draw the conclusion that the human body simply wasn’t built for the task. 

But on May 6, 1954, Roger Bannister took everyone by surprise. It was a cold, wet day in Oxford, England—conditions no one expected to lend themselves to record-setting—and yet Bannister did just that, running a mile in 3:59.4 and becoming the first person in the record books to run a mile in under four minutes. 

This shift in the benchmark had profound effects; the world now knew that the four-minute mile was possible. Bannister’s record lasted only forty-six days, when it was snatched away by Australian runner John Landy. Then a year later, three runners all beat the four-minute barrier together in the same race. Since then, over 1,400 runners have officially run a mile in under four minutes; the current record is 3:43.13, held by Moroccan athlete Hicham El Guerrouj.

We achieve far more when we believe that something is possible, and we will believe it’s possible only when we see someone else has already done it—and as with human running speed, so it is with what we believe are the hard limits for how a website needs to perform.

Establishing standards for a sustainable web

In most major industries, the key metrics of environmental performance are fairly well established, such as miles per gallon for cars or energy per square meter for homes. The tools and methods for calculating those metrics are standardized as well, which keeps everyone on the same page when doing environmental assessments. In the world of websites and apps, however, we aren’t held to any particular environmental standards, and only recently have gained the tools and methods we need to even make an environmental assessment.

The primary goal in sustainable web design is to reduce carbon emissions. However, it’s almost impossible to actually measure the amount of CO2 produced by a web product. We can’t measure the fumes coming out of the exhaust pipes on our laptops. The emissions of our websites are far away, out of sight and out of mind, coming out of power stations burning coal and gas. We have no way to trace the electrons from a website or app back to the power station where the electricity is being generated and actually know the exact amount of greenhouse gas produced. So what do we do? 

If we can’t measure the actual carbon emissions, then we need to find what we can measure. The primary factors that could be used as indicators of carbon emissions are:

  1. Data transfer 
  2. Carbon intensity of electricity

Let’s take a look at how we can use these metrics to quantify the energy consumption, and in turn the carbon footprint, of the websites and web apps we create.

Data transfer

Most researchers use kilowatt-hours per gigabyte (kWh/GB) as a metric of energy efficiency when measuring the amount of data transferred over the internet when a website or application is used. This provides a great reference point for energy consumption and carbon emissions. As a rule of thumb, the more data transferred, the more energy used in the data center, telecoms networks, and end user devices.

For web pages, data transfer for a single visit can be most easily estimated by measuring the page weight, meaning the transfer size of the page in kilobytes the first time someone visits the page. It’s fairly easy to measure using the developer tools in any modern web browser. Often your web hosting account will include statistics for the total data transfer of any web application (Fig 2.1).

Fig 2.1: The Kinsta hosting dashboard displays data transfer alongside traffic volumes. If you divide data transfer by visits, you get the average data per visit, which can be used as a metric of efficiency.
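As a sketch of that first-visit measurement, the browser’s Resource Timing API can total the bytes that actually came over the wire. This is a console-level sketch, not a tool from the book; note that transferSize reports 0 for cache hits, so run it on a first, uncached visit:

    // Approximate page weight from the developer console.
    const nav = performance.getEntriesByType("navigation") as PerformanceNavigationTiming[];
    const resources = performance.getEntriesByType("resource") as PerformanceResourceTiming[];

    const totalBytes =
      nav.reduce((sum, entry) => sum + entry.transferSize, 0) +
      resources.reduce((sum, entry) => sum + entry.transferSize, 0);

    console.log(`Approximate page weight: ${(totalBytes / 1024).toFixed(1)} kB`);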

The nice thing about page weight as a metric is that it allows us to compare the efficiency of web pages on a level playing field without confusing the issue with constantly changing traffic volumes. 

There is plenty of scope for reducing page weight. By early 2020, the median page weight was 1.97 MB for setups the HTTP Archive classifies as “desktop” and 1.77 MB for “mobile,” with desktop increasing 36 percent since January 2016 and mobile page weights nearly doubling in the same period (Fig 2.2). Roughly half of this data transfer is image files, making images the single biggest source of carbon emissions on the average website. 

History clearly shows us that our web pages can be smaller, if only we set our minds to it. While most technologies become ever more energy efficient, including the underlying technology of the web such as data centers and transmission networks, websites themselves are a technology that becomes less efficient as time goes on.

Fig 2.2: The historical page weight data from HTTP Archive can teach us a lot about what is possible in the future.

You might be familiar with the concept of performance budgeting as a way of focusing a project team on creating faster user experiences. For example, we might specify that the website must load in a maximum of one second on a broadband connection and three seconds on a 3G connection. Much like speed limits while driving, performance budgets are upper limits rather than vague suggestions, so the goal should always be to come in under budget.

Designing for fast performance does often lead to reduced data transfer and emissions, but it isn’t always the case. Web performance is often more about the subjective perception of load times than it is about the true efficiency of the underlying system, whereas page weight and transfer size are more objective measures and more reliable benchmarks for sustainable web design. 

We can set a page weight budget in reference to a benchmark of industry averages, using data from sources like HTTP Archive. We can also benchmark page weight against competitors or the old version of the website we’re replacing. For example, we might set a maximum page weight budget as equal to our most efficient competitor, or we could set the benchmark lower to guarantee we are best in class. 
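Once a budget is chosen, it can be enforced mechanically in a build or CI step. A toy sketch, where both the budget figure and the measured value are placeholder assumptions:

    // Fail the build when the measured transfer size exceeds the budget.
    const PAGE_WEIGHT_BUDGET_KB = 1000; // e.g. benchmarked against a competitor
    const measuredPageWeightKb = 870;   // stand-in for a real measurement

    if (measuredPageWeightKb > PAGE_WEIGHT_BUDGET_KB) {
      throw new Error(
        `Page weight ${measuredPageWeightKb} kB exceeds budget of ${PAGE_WEIGHT_BUDGET_KB} kB`
      );
    }
    console.log(`Within budget: ${measuredPageWeightKb}/${PAGE_WEIGHT_BUDGET_KB} kB`);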

If we want to take it to the next level, then we could also start looking at the transfer size of our web pages for repeat visitors. Although page weight for the first time someone visits is the easiest thing to measure, and easy to compare on a like-for-like basis, we can learn even more if we start looking at transfer size in other scenarios too. For example, visitors who load the same page multiple times will likely have a high percentage of the files cached in their browser, meaning they don’t need to transfer all of the files on subsequent visits. Likewise, a visitor who navigates to new pages on the same website will likely not need to load the full page each time, as some global assets from areas like the header and footer may already be cached in their browser. Measuring transfer size at this next level of detail can help us learn even more about how we can optimize efficiency for users who regularly visit our pages, and enable us to set page weight budgets for additional scenarios beyond the first visit.

Page weight budgets are easy to track throughout a design and development process. Although they don’t actually tell us carbon emission and energy consumption analytics directly, they give us a clear indication of efficiency relative to other websites. And as transfer size is an effective analog for energy consumption, we can actually use it to estimate energy consumption too.

In summary, reduced data transfer translates to energy efficiency, a key factor to reducing carbon emissions of web products. The more efficient our products, the less electricity they use, and the less fossil fuels need to be burned to produce the electricity to power them. But as we’ll see next, since all web products demand some power, it’s important to consider the source of that electricity, too.

Carbon intensity of electricity

Regardless of energy efficiency, the level of pollution caused by digital products depends on the carbon intensity of the energy being used to power them. Carbon intensity is a term used to define the grams of CO2 produced for every kilowatt-hour of electricity (gCO2/kWh). This varies widely, with renewable energy sources and nuclear having an extremely low carbon intensity of less than 10 gCO2/kWh (even when factoring in their construction); whereas fossil fuels have very high carbon intensity of approximately 200–400 gCO2/kWh. 

Most electricity comes from national or state grids, where energy from a variety of different sources is mixed together with varying levels of carbon intensity. The distributed nature of the internet means that a single user of a website or app might be using energy from multiple different grids simultaneously; a website user in Paris uses electricity from the French national grid to power their home internet and devices, but the website’s data center could be in Dallas, USA, pulling electricity from the Texas grid, while the telecoms networks use energy from everywhere between Dallas and Paris.

We don’t have control over the full energy supply of web services, but we do have some control over where we host our projects. With a data center using a significant proportion of the energy of any website, locating the data center in an area with low carbon energy will tangibly reduce its carbon emissions. Danish startup Tomorrow reports and maps this user-contributed data, and a glance at their map shows how, for example, choosing a data center in France will have significantly lower carbon emissions than a data center in the Netherlands (Fig 2.3).

Fig 2.3: Tomorrow’s electricityMap shows live data for the carbon intensity of electricity by country.

That said, we don’t want to locate our servers too far away from our users; it takes energy to transmit data through the telecoms networks, and the further the data travels, the more energy is consumed. Just like food miles, we can think of the distance from the data center to the website’s core user base as “megabyte miles”—and we want it to be as small as possible.

Using the distance itself as a benchmark, we can use website analytics to identify the country, state, or even city where our core user group is located and measure the distance from that location to the data center used by our hosting company. This will be a somewhat fuzzy metric as we don’t know the precise center of mass of our users or the exact location of a data center, but we can at least get a rough idea. 

For example, if a website is hosted in London but the primary user base is on the West Coast of the USA, then we could look up the distance from London to San Francisco, which is 5,300 miles. That’s a long way! We can see that hosting it somewhere in North America, ideally on the West Coast, would significantly reduce the distance and thus the energy used to transmit the data. In addition, locating our servers closer to our visitors helps reduce latency and delivers better user experience, so it’s a win-win.
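Putting a number on megabyte miles only takes the haversine great-circle formula; the sketch below and its approximate city coordinates are illustrative, not part of the book:

    // Great-circle distance between two points, in miles (haversine).
    function haversineMiles(lat1: number, lon1: number, lat2: number, lon2: number): number {
      const R = 3959; // Earth's mean radius in miles
      const rad = (deg: number) => (deg * Math.PI) / 180;
      const dLat = rad(lat2 - lat1);
      const dLon = rad(lon2 - lon1);
      const a =
        Math.sin(dLat / 2) ** 2 +
        Math.cos(rad(lat1)) * Math.cos(rad(lat2)) * Math.sin(dLon / 2) ** 2;
      return 2 * R * Math.asin(Math.sqrt(a));
    }

    // London -> San Francisco, matching the example above:
    console.log(Math.round(haversineMiles(51.51, -0.13, 37.77, -122.42)), "miles"); // ≈ 5,350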

Converting it back to carbon emissions

If we combine carbon intensity with a calculation for energy consumption, we can calculate the carbon emissions of our websites and apps. A tool my team created does this by measuring the data transfer over the wire when loading a web page, calculating the amount of electricity associated, and then converting that into a figure for CO2 (Fig 2.4). It also factors in whether or not the web hosting is powered by renewable energy.
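A back-of-the-envelope version of that calculation can be sketched in a few lines; the kilowatt-hours-per-gigabyte and carbon-intensity constants below are illustrative assumptions, not figures from the book or from the calculator itself:

    // Rough CO2 estimate for a single first-time page view.
    const KWH_PER_GB = 0.8;     // assumed whole-system energy per GB transferred
    const GRID_INTENSITY = 442; // assumed average grid intensity, gCO2/kWh
    const GREEN_INTENSITY = 50; // assumed intensity for renewable-powered hosting

    function gramsCo2PerView(pageWeightMb: number, greenHosting: boolean): number {
      const energyKwh = (pageWeightMb / 1024) * KWH_PER_GB;
      return energyKwh * (greenHosting ? GREEN_INTENSITY : GRID_INTENSITY);
    }

    // The early-2020 desktop median of 1.97 MB cited earlier:
    console.log(gramsCo2PerView(1.97, false).toFixed(3), "g CO2 per view");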

If you want to take it to the next level and tailor the data more accurately to the unique aspects of your project, the Energy and Emissions Worksheet accompanying this book shows you how.

Fig 2.4: The Website Carbon Calculator shows how the Riverford Organic website embodies their commitment to sustainability, being both low carbon and hosted in a data center using renewable energy.

With the ability to calculate carbon emissions for our projects, we could actually take a page weight budget one step further and set carbon budgets as well. CO2 is not a metric commonly used in web projects; we’re more familiar with kilobytes and megabytes, and can fairly easily look at design options and files to assess how big they are. Translating that into carbon adds a layer of abstraction that isn’t as intuitive—but carbon budgets do focus our minds on the primary thing we’re trying to reduce, and support the core objective of sustainable web design: reducing carbon emissions.

Browser Energy

Data transfer might be the simplest and most complete analog for energy consumption in our digital projects, but by giving us one number to represent the energy used in the data center, the telecoms networks, and the end user’s devices, it can’t offer us insights into the efficiency in any specific part of the system.

One part of the system we can look at in more detail is the energy used by end users’ devices. As front-end web technologies become more advanced, the computational load is increasingly moving from the data center to users’ devices, whether they be phones, tablets, laptops, desktops, or even smart TVs. Modern web browsers allow us to implement more complex styling and animation on the fly using CSS and JavaScript. Furthermore, JavaScript libraries such as Angular and React allow us to create applications where the “thinking” work is done partly or entirely in the browser. 

All of these advances are exciting and open up new possibilities for what the web can do to serve society and create positive experiences. However, more computation in the user’s web browser means more energy used by their devices. This has implications not just environmentally, but also for user experience and inclusivity. Applications that put a heavy processing load on the user’s device can inadvertently exclude users with older, slower devices and cause batteries on phones and laptops to drain faster. Furthermore, if we build web applications that require the user to have up-to-date, powerful devices, people throw away old devices much more frequently. This isn’t just bad for the environment, but it puts a disproportionate financial burden on the poorest in society.

Partly because the tools are limited, and partly because there are so many different models of devices, it’s difficult to measure website energy consumption on end users’ devices. One tool we do currently have is the Energy Impact monitor inside the developer console of the Safari browser (Fig 2.5).

Fig 2.5: The Energy Impact meter in Safari (on the right) shows how a website consumes CPU energy.

You know when you load a website and your computer’s cooling fans start spinning so frantically you think it might actually take off? That’s essentially what this tool is measuring. 

It shows us the percentage of CPU used and the duration of CPU usage when loading the web page, and uses these figures to generate an energy impact rating. It doesn’t give us precise data for the amount of electricity used in kilowatts, but the information it does provide can be used to benchmark how efficiently your websites use energy and set targets for improvement.




web

The Wax and the Wane of the Web

I offer a single bit of advice to friends and family when they become new parents: When you start to think that you’ve got everything figured out, everything will change. Just as you start to get the hang of feedings, diapers, and regular naps, it’s time for solid food, potty training, and overnight sleeping. When you figure those out, it’s time for preschool and rare naps. The cycle goes on and on.

The same applies for those of us working in design and development these days. Having worked on the web for almost three decades at this point, I’ve seen the regular wax and wane of ideas, techniques, and technologies. Each time that we as developers and designers get into a regular rhythm, some new idea or technology comes along to shake things up and remake our world.

How we got here

I built my first website in the mid-’90s. Design and development on the web back then was a free-for-all, with few established norms. For any layout aside from a single column, we used table elements, often with empty cells containing a single pixel spacer GIF to add empty space. We styled text with numerous font tags, nesting the tags every time we wanted to vary the font style. And we had only three or four typefaces to choose from: Arial, Courier, or Times New Roman. When Verdana and Georgia came out in 1996, we rejoiced because our options had nearly doubled. The only safe colors to choose from were the 216 “web safe” colors known to work across platforms. The few interactive elements (like contact forms, guest books, and counters) were mostly powered by CGI scripts (predominantly written in Perl at the time). Achieving any kind of unique look involved a pile of hacks all the way down. Interaction was often limited to specific pages in a site.

The birth of web standards

At the turn of the century, a new cycle started. Crufty code littered with table layouts and font tags waned, and a push for web standards waxed. Newer technologies like CSS got more widespread adoption by browser makers, developers, and designers. This shift toward standards didn’t happen accidentally or overnight. It took active engagement between the W3C and browser vendors and heavy evangelism from folks like the Web Standards Project to build standards. A List Apart and books like Designing with Web Standards by Jeffrey Zeldman played key roles in teaching developers and designers why standards are important, how to implement them, and how to sell them to their organizations. And approaches like progressive enhancement introduced the idea that content should be available for all browsers—with additional enhancements available for more advanced browsers. Meanwhile, sites like the CSS Zen Garden showcased just how powerful and versatile CSS can be when combined with a solid semantic HTML structure.

Server-side languages like PHP, Java, and .NET overtook Perl as the predominant back-end processors, and the cgi-bin was tossed in the trash bin. With these better server-side tools came the first era of web applications, starting with content-management systems (particularly in the blogging space with tools like Blogger, Grey Matter, Movable Type, and WordPress). In the mid-2000s, AJAX opened doors for asynchronous interaction between the front end and back end. Suddenly, pages could update their content without needing to reload. A crop of JavaScript frameworks like Prototype, YUI, and jQuery arose to help developers build more reliable client-side interaction across browsers that had wildly varying levels of standards support. Techniques like image replacement let crafty designers and developers display fonts of their choosing. And technologies like Flash made it possible to add animations, games, and even more interactivity.

These new technologies, standards, and techniques reinvigorated the industry in many ways. Web design flourished as designers and developers explored more diverse styles and layouts. But we still relied on tons of hacks. Early CSS was a huge improvement over table-based layouts when it came to basic layout and text styling, but its limitations at the time meant that designers and developers still relied heavily on images for complex shapes (such as rounded or angled corners) and tiled backgrounds for the appearance of full-length columns (among other hacks). Complicated layouts required all manner of nested floats or absolute positioning (or both). Flash and image replacement for custom fonts was a great start toward varying the typefaces from the big five, but both hacks introduced accessibility and performance problems. And JavaScript libraries made it easy for anyone to add a dash of interaction to pages, although at the cost of doubling or even quadrupling the download size of simple websites.

The web as software platform

The symbiosis between the front end and back end continued to improve, and that led to the current era of modern web applications. Between expanded server-side programming languages (which kept growing to include Ruby, Python, Go, and others) and newer front-end tools like React, Vue, and Angular, we could build fully capable software on the web. Alongside these tools came others, including collaborative version control, build automation, and shared package libraries. What was once primarily an environment for linked documents became a realm of infinite possibilities.

At the same time, mobile devices became more capable, and they gave us internet access in our pockets. Mobile apps and responsive design opened up opportunities for new interactions anywhere and any time.

This combination of capable mobile devices and powerful development tools contributed to the waxing of social media and other centralized tools for people to connect and consume. As it became easier and more common to connect with others directly on Twitter, Facebook, and even Slack, the desire for hosted personal sites waned. Social media offered connections on a global scale, with both the good and bad that that entails.

Want a much more extensive history of how we got here, with some other takes on ways that we can improve? Jeremy Keith wrote “Of Time and the Web.” Or check out the “Web Design History Timeline” at the Web Design Museum. Neal Agarwal also has a fun tour through “Internet Artifacts.”

Where we are now

In the last couple of years, it’s felt like we’ve begun to reach another major inflection point. As social-media platforms fracture and wane, there’s been a growing interest in owning our own content again. There are many different ways to make a website, from the tried-and-true classic of hosting plain HTML files to static site generators to content management systems of all flavors. The fracturing of social media also comes with a cost: we lose crucial infrastructure for discovery and connection. Webmentions, RSS, ActivityPub, and other tools of the IndieWeb can help with this, but they’re still relatively underimplemented and hard to use for the less nerdy. We can build amazing personal websites and add to them regularly, but without discovery and connection, it can sometimes feel like we may as well be shouting into the void.

Browser support for CSS, JavaScript, and other standards like web components has accelerated, especially through efforts like Interop. New technologies gain support across the board in a fraction of the time that they used to. I often learn about a new feature and check its browser support only to find that its coverage is already above 80 percent. Nowadays, the barrier to using newer techniques often isn’t browser support but simply the limits of how quickly designers and developers can learn what’s available and how to adopt it.

Today, with a few commands and a couple of lines of code, we can prototype almost any idea. All the tools that we now have available make it easier than ever to start something new. But the upfront cost that these frameworks may save in initial delivery eventually comes due as upgrading and maintaining them becomes a part of our technical debt.

If we rely on third-party frameworks, adopting new standards can sometimes take longer since we may have to wait for those frameworks to adopt those standards. These frameworks—which used to let us adopt new techniques sooner—have now become hindrances instead. These same frameworks often come with performance costs too, forcing users to wait for scripts to load before they can read or interact with pages. And when scripts fail (whether through poor code, network issues, or other environmental factors), there’s often no alternative, leaving users with blank or broken pages.

Where do we go from here?

Today’s hacks help to shape tomorrow’s standards. And there’s nothing inherently wrong with embracing hacks—for now—to move the present forward. Problems only arise when we’re unwilling to admit that they’re hacks or we hesitate to replace them. So what can we do to create the future we want for the web?

Build for the long haul. Optimize for performance, for accessibility, and for the user. Weigh the costs of those developer-friendly tools. They may make your job a little easier today, but how do they affect everything else? What’s the cost to users? To future developers? To standards adoption? Sometimes the convenience may be worth it. Sometimes it’s just a hack that you’ve grown accustomed to. And sometimes it’s holding you back from even better options.

Start from standards. Standards continue to evolve over time, but browsers have done a remarkably good job of continuing to support older standards. The same isn’t always true of third-party frameworks. Sites built with even the hackiest of HTML from the ’90s still work just fine today. The same can’t always be said of sites built with frameworks even after just a couple years.

Design with care. Whether your craft is code, pixels, or processes, consider the impacts of each decision. The convenience of many a modern tool comes at the cost of not always understanding the underlying decisions that have led to its design and not always considering the impact that those decisions can have. Rather than rushing headlong to “move fast and break things,” use the time saved by modern tools to consider more carefully and design with deliberation.

Always be learning. If you’re always learning, you’re also growing. Sometimes it may be hard to pinpoint what’s worth learning and what’s just today’s hack. You might end up focusing on something that won’t matter next year, even if you were to focus solely on learning standards. (Remember XHTML?) But constant learning opens up new connections in your brain, and the hacks that you learn one day may help to inform different experiments another day.

Play, experiment, and be weird! This web that we’ve built is the ultimate experiment. It’s the single largest human endeavor in history, and yet each of us can create our own pocket within it. Be courageous and try new things. Build a playground for ideas. Make goofy experiments in your own mad science lab. Start your own small business. There has never been a more empowering place to be creative, take risks, and explore what we’re capable of.

Share and amplify. As you experiment, play, and learn, share what’s worked for you. Write on your own website, post on whichever social media site you prefer, or shout it from a TikTok. Write something for A List Apart! But take the time to amplify others too: find new voices, learn from them, and share what they’ve taught you.

Go forth and make

As designers and developers for the web (and beyond), we’re responsible for building the future every day, whether that may take the shape of personal websites, social media tools used by billions, or anything in between. Let’s imbue our values into the things that we create, and let’s make the web a better place for everyone. Create that thing that only you are uniquely qualified to make. Then share it, make it better, make it again, or make something new. Learn. Make. Share. Grow. Rinse and repeat. Every time you think that you’ve mastered the web, everything will change.




web

How we’re adding Lottie animations to our website and product

We’ve been rolling out a fresh wave of Lottie animations to our website and product.

Here’s how we’re doing it ✨…

Read on Twitter or Read on Medium




web

India trying to fix hacked websites of seven of its embassies




web

Weber Revisited: The Protestant Ethic and the Spirit of Nationalism [electronic journal].




web

Cannabis Prices on the Dark Web [electronic journal].




web

Solutions and Tools for Dealing with Broken Links in Web Pages

A couple of months ago a post by Leo Blanchette got to the front page of Hacker News and there was an interesting discussion on dealing with broken links and external content – the main problem being links that become out of date due to paywalls, altered content, or content getting taken down.

I’ve been running this blog since May 2008. If you’ve run a content-driven site for even a fraction of that, you know that link rot is a problem. In this post I’ll go over some of the suggestions in that thread along with some tools to use to check for broken links.
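To make that concrete, the core of the simplest such checker fits in a few lines. A minimal sketch, assuming a Node 18+ runtime with a global fetch; the URLs are placeholders:

    // HEAD-request each URL and report anything that errors or
    // returns a 4xx/5xx status. (Some servers reject HEAD; a real
    // tool would retry those with GET.)
    async function checkLinks(urls: string[]): Promise<void> {
      for (const url of urls) {
        try {
          const res = await fetch(url, { method: "HEAD", redirect: "follow" });
          if (!res.ok) console.log(`BROKEN (${res.status}): ${url}`);
        } catch {
          console.log(`UNREACHABLE: ${url}`);
        }
      }
    }

    checkLinks(["https://example.com/", "https://example.com/no-such-page"]);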

The post Solutions and Tools for Dealing with Broken Links in Web Pages appeared first on Impressive Webs.




web

542: Breaking Up with CSS-in-JS, Mastodon, and Stories on the Web

We're talking CSS-in-JS, Token CSS, Matuzo being suspended from Twitter, trying out Mastodon, testing out stable diffusion, stories on the web, and Jake Albaugh's new social network.




web

557: ChatGPT, Conferences, Fidgets on the Web, and Modern CSS in Real Life

When will AI be able to tell you the risk / reward of cleaning up trees? Are conferences back? Bringing fidgets to the web, internet as an anxiety machine, and Chris is working on a talk on modern CSS in real life.




web

561: Web Perf News, Web Sommelier, Data Analytics, and Passkeys

Topics for this one include how do you learn about web performance news? Do you need a web components sommelier? Our thoughts on Syntax going to Sentry, and being able to focus on the things you want to focus on. Passkeys, Arc split screen, and vibe driven development.




web

569: Apple’s Web Apps, Meta Quest and Vision Pro, and Missing Sticky Headers

How do you point out things in a UI? Are Arc Boosts the end of the web? What do you think of VR and AR / Vision Pro and Meta Quest? And what do you do when the sticky header goes missing?




web

571: Searching vs AI, Getting Designers to Play Nice, and Web Components

Do you listen at 2x? Do Chris and Dave sound weird at normal speed IRL? How searching compares to using AI, chatbots kind of suck at context, getting a designer to work with developers at an agency, what happened to content visibility, and how to best build a design system using web components.




web

576: Blocks, Components, Linting Images, Engines, and “Web Integrity”

We're talking how we stay online - or not - on vacation, is create-guten-block the future for us WP developers? Can we get a state of the web component address from the President of web components? Have we seen the last new browser engine? And deciding whether to add features or remove them from your app.




web

585: Blog Redesign, Sounds on a Website, Accessibility Tests, and Safari 17

Chris redesigned his blog, using sounds on your website to make it seem fancy, what can't automated accessibility tests test, and what's new in Safari 17.




web

588: Elliott Marquez on Web Components and Lit

Elliott Marquez talks with us about the history of Polymer and Lit, why you should pick Lit, working with web components, the shadow dom, managing state, and how Material design is built with web components.




web

590: Twisting Through Websites

The excitement of launching Luro, changes in social media platforms, different seasons for coding and marketing, embedded social media post weight, CSS thoughts from Web Unleashed, focus state issues, and fact checking and updating old posts on your blog.




web

592: Web Component Therapy, SEO Therapy, and Learning Something New like Swift

Talking web components, progressive enhancement, style-able components, having to pay before you get to see a demo, being annoyed at the business of SEO, and subscriptions vs ads.




web

593: Beep & Texts, Tumblr, JavaScript & Web Components, & Cool Blog Post Ideas

Thoughts on smashing all communication messaging apps together, what's happened to Tumblr under Automattic, what the situation is with native web components and JavaScript, and looking at a list of types of blog posts.




web

598: Jen Simmons on Interop, WebKit Releases, and New CSS Features in Safari

Jen Simmons, Apple Evangelist on the Web Developer Experience team for Safari & WebKit, stops by to talk about what Interop is, and a look ahead at new CSS features in WebKit and Safari such as JPEG XL, masks, a round function, JavaScript improvements, styling form controls, content unblocks, masonry, and more!




web

599: Fighting the Algorithm With RSS, Blogging, and the IndieWeb

Dave and Chris discuss indie web culture, the role of social media in today's society, and the challenges and strategies of freelancing. Additionally, they discuss a range of topics from content moderation, coding and refining tech skills, to emerging startups and the future of web technology.




web

600: Where Will The Web Be 12 Years from Now?

We've got your feedback as well as our thoughts on where we all think the web will be in 2036 - as we celebrate 12 years of ShopTalk Show history, we're looking forward to what's to come with ideas around cookie banners, undo, no more passwords, React, Deno, Node, and Mozilla's future, ChatGPT's thoughts, accessibility, blockchain, VR / AR, hoverboards, P3 color space, indie web, JS bundle sizes, and more!




web

606: Web Sustainability with Michelle Barker

We're talking with Michelle Barker about the idea of paying to support bloggers (and podcasters!) via services like Patreon, drumming as a fun side gig from CSS, how big of an issue digital sustainability is, trying to understand the environmental impact of our websites and digital life, wondering why YouTube embeds are still so […]




web

608: Can WordPress Kill Your Resume, Fav Parts of Web Dev, Exploring HTMX, and more!

We're opening up the ShopTalk mailbag and answering your questions, including does WordPress on your resume kill your job chances, what are our fav and least fav parts of web dev, our thoughts on HTMX, and what is it like to use pnpm instead of npm.




web

613: Recording Live Music, WebC, Open Source, & WordPress Studio

Chris bought recording gear off an Instagram ad, our thoughts on WebC, CodePen upgrades Yarn, thoughts on the commercial value of open source, Automattic releases an app to install WordPress locally, IBM buys Hashicorp, income tax software, and a hack for getting Safari to respect background colors used in a pseudo selector.




web

622: Website Rendering, Updating Software, and Edge Gets Faster

We're talking website rendering, server side rendering, Astro's server islands, perf hits for navigation elements, updating software because the docs aren't available for older versions, and a new Microsoft Edge was released.




web

629: The Great Divide, Global Design + Web Components, and Job Titles

A bit of follow-up on vibe driven development and JavaScript not causing The Great Divide, writing testing automation, global design systems and web components, could PHP be used for web components, what if view transitions are going to be everywhere, and frontend engineer vs design systems engineer job titles and descriptions.




web

633: Thomas Steiner on AI in Chrome and the Web

Thomas Steiner from Project Fugu talks with us about AI in Chrome, the small large language model in use, how features like this are rolled out, the ethics and concerns around sending and sharing data, on device vs web APIs, and ideas for use cases and ways to explore AI on the web.




web

636: W Hot Drama Week (WordPress, WP Engine, and Web Components – Oh My!)

We're getting some feelings out about WordPress and Matt Mullenweg vs WP Engine drama, as well as the Web Components conversation that happened this past week.




web

640: Navigating the Pros and Cons of Web Components

Riffing off a Dave Rupert blog post, Chris and Dave talk through the pros and cons of web components, when to use them, when it's a bad idea to use them, what it would take to make the Next.js of web components, and how long until we don't need any more frameworks?




web

Webcams & borewells

A groundwater expert uses a webcam to find out the yield from borewells and identify problems.





web

Gift Giving to the World (Wide Web)

Frances Berriman asks us to give the gift of consideration to those who are using the web on constricted devices such as low-end smart phones or feature phones. Christmas is a time of good will to all, and as Bugsy Malone reminds us, you give a little love and it all comes back to you.


If I was given the job of Father Christmas with all my human limitations, apparently it would take me something like 6 months at non-stop full speed to deliver gifts to every kid on the planet. The real Father Christmas has the luxury of magic when it comes to delivering millions of gifts in just one night, but the only magical platform at my disposal is the world wide web, so I propose switching to digital gift cards and saving the reindeer feed.

300 million people are set to come online for the very first time in 2020, and a majority of those will be doing so via mobile phones (smart- and feature-phones). If we want those new users to have a great time online, spending those gift cards, we need to start thinking about their needs and limitations.

Suit up

We might not be hopping on the sleigh for these deliveries, but let’s suit up for the journey and get the tools we need to start testing and checking how our online gift-receivers will be enjoying their online shopping experience.

Of course, the variety of phones and OSs out there is huge, but we have a few options to get a sense for the median. Here are a few suggestions on where to start:

  • Never has there been a better time to advocate at your workplace for a device testing suite or lab.
  • You can also just pick up a low-end phone for a few bucks and spend some real time using it and getting a sense for how it feels to live with it every day. May I suggest the Nokia 2 or the Moto E6 - both very representative devices of the sort our new visitors will be on.
  • You’ve also got WebPageTest.org at your disposal, where you can emulate various phones and see your sites rendered in real-time to get a sense of what an experience may look like for your users.
  • You’ll also want to set yourself some goals. A performance budget, for example, is a good way to know if the code you’re shipping hits the mark in a more programmatic way.

Gift wrap

Many of us began our internet lives on desktop machines, and thanks to Moore’s law, these machines have been getting ever more powerful every year with more CPUs and memory at our disposal. The mobile phone landscape somewhat resets us on what hardware capacity is available on the client-side of our code, so it’s time to lighten the load.

What we see in the landscape of phones today is a huge spread of capabilities and CPU speeds, storage capacity and memory. And the gap between the haves and the have-nots is widening, so we have a huge task to deal with in meeting the needs of such a varied audience.

As far as possible, we should try to:

  • Keep processing off the client - do anything you can server-side. Consider a server-side render (hold the <script>, thanks) for anything relatively static (including cached frequent queries and results) to keep client-side JavaScript to the minimum. This way you’re spending your CPU, not the user’s.
  • Avoid sending everything you have to the end user. Mobile-first access also means data-plan-first access for many, which means they may be literally paying in cold, hard cash for everything you send over the wire – or may be experiencing your site over a degraded “4G” connection towards the end of the month.
  • Aggressively cache assets to prevent re-downloading anything you’ve sent before. Don’t make the user pay twice if they don’t have to.
  • Progressively load additional assets and information as the user requests them, rather than a big upfront payload, that way you’re giving the end user a little more choice about whether they want or need that extra data set.

This is all to say that as web developers, we have a lot more control over how and when we deliver the meat of our products - unlike native apps that generally send the whole experience down as one multi-megabyte download that our 4G and data-strapped users can’t afford.
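To make the caching point above concrete, here is a minimal Node sketch (an illustration, not a production setup; the /assets/ path convention is an assumption) of serving fingerprinted assets with long-lived, immutable caching so a returning visitor re-downloads nothing:

    import { createServer } from "node:http";

    // Minimal sketch: fingerprinted static assets get aggressive,
    // immutable caching, so repeat visits cost the user no data.
    const server = createServer((req, res) => {
      if (req.url?.startsWith("/assets/")) {
        // Safe because a fingerprinted filename changes whenever its content does.
        res.setHeader("Cache-Control", "public, max-age=31536000, immutable");
        res.end("/* asset body would be streamed here */");
      } else {
        // HTML can change at any time, so make the browser revalidate it.
        res.setHeader("Cache-Control", "no-cache");
        res.end("<!doctype html><title>Hello</title>");
      }
    });

    server.listen(8080);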

Make a wish

Finally, it’s time for your gift recipients to go out onto the web and find whatever their greatest wish is. For many, that’s going to begin when they first turn on their phone and see all those enticing icons on their home screen. Opening a browser may not be their first port of call.

They’ll be primed to look for sites and information through the icon-heavy menu that most mobile OSs use today, and they will be encouraged to find new experiences through the provided app store interface.

The good news is that web experiences can be found in many modern app stores today.

For example, if you build an app using Trusted Web Activities, the Google Play Store will list your web site right alongside native apps and allow users to install them on their phones. Samsung and Microsoft have similar options without the extra step of creating a TWA - they’ll list any Progressive Web App in their stores. Tools like Microsoft’s PWA Builder and Llama Pack are making this easier than ever.

If your users are primed to search for new experiences via a search engine instead, then they’ll benefit from the work you’ve put in to list them in app stores regardless, as PWAs are first and foremost about making websites mobile-friendly, regardless of point of sale. A PWA will provide them with offline support, service workers, notifications and much more.
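As a minimal sketch of the offline-support piece, a cache-first service worker can be just a few lines (the cache name and asset list below are placeholders, and the event parameters are loosely typed for brevity):

    // sw.ts – cache-first service worker sketch.
    const CACHE = "offline-v1";                     // placeholder cache name
    const ASSETS = ["/", "/styles.css", "/app.js"]; // placeholder asset list

    self.addEventListener("install", (event: any) => {
      // Pre-cache the core assets when the worker installs.
      event.waitUntil(caches.open(CACHE).then((cache) => cache.addAll(ASSETS)));
    });

    self.addEventListener("fetch", (event: any) => {
      // Serve from the cache first, falling back to the network.
      event.respondWith(
        caches.match(event.request).then((hit) => hit ?? fetch(event.request))
      );
    });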

We do have a grinch in this story, however.

Apple’s iOS explicitly does not allow your website to be listed in their app store, so sadly you’ll have a harder time reaching those users. But it is possible! Fortunately, iOS isn’t as all-dominating world wide as it is in the tech community, selling only around 10-15% of smartphones out in the world.

The best present

The WWW is a wonderful gift that we received over 30 years ago and, as web developers, we get to steward and share this truly global, open platform with millions of people every day. Let’s take care of it by building and sharing experiences that truly meet the needs of everyone.


About the author

Frances Berriman is a San Francisco-based British-born designer and web developer who blogs at fberriman.com. She’s done all sorts of things, but has a special soft spot for public sector projects, and has worked for the Government Digital Service, building GOV.UK, Code for America, Nature Publishing and the BBC and is currently Head of UX and Product Design at Netlify.

More articles by Frances




web

Game theory [electronic resource] : decision, interaction, and evolution / James N. Webb

[London] : Springer, [2007]




web

‘Dayaa’ web series review: A solid thriller drama series that has scope for a brilliant follow-up 

Director Pavan Sadineni’s Telugu web series ‘Dayaa’, headlined by J D Chakravarthy’s restrained brilliance, is a gritty series that can leave viewers curious to know more




web

HMWS&SB website taken down after reported hacking

Hyderabad Water Board website down for maintenance after redirection to unauthorised sites, no impact on regular services




web

Web address for Science Portal in progress

The URL for the mock-up Leila showed last Tuesday is:
lnadams.org/msl.htm

Please remember that this is just a design layout, the links do not work, and it is subject to extreme change.

Comments are highly encouraged! Please post to this blog or email Sara or Joe.




web

Combining forces, GSAP & Webflow!

Change can certainly be scary whenever a beloved, independent software library becomes a part of a larger organization. I’m feeling a bit more excitement than concern this time around, though.

If you haven’t heard, GSAP (GreenSock Animation Platform) is teaming up with Webflow.


Combining forces, GSAP & Webflow! originally published on CSS-Tricks, which is part of the DigitalOcean family. You should get the newsletter.