
29/11/10

How to Deploy HTTPS Correctly

Chris Palmer, 15 Nov 2010

Internet technologists have long known that HTTP is insecure, causing many risks to users. The release of Firesheep made one of these risks concrete and obvious to even non-technical folks.

While HTTPS has long existed as a reasonable way to improve web security, web operators have been slow to host their applications with it. In part, this is because correctly and completely hosting an application with HTTPS takes some care.

This article is designed to help web operators get a conceptual handle on how to protect their users with HTTPS. Taking a little bit of care to protect your users is a reasonable thing for web application providers to do, and a good thing for users to demand.

Background
HTTPS provides three security guarantees:
1. Server authentication allows the browser and the user to have some confidence that they are talking to the true application server. Without this guarantee, there can be no guarantee of confidentiality or integrity.
2. Data confidentiality means that eavesdroppers cannot understand the communications between the user’s browser and the web server, because the data is encrypted.
3. Data integrity means that a network attacker cannot damage or alter the content of the communications between the user’s browser and the web server, because they are validated with a cryptographic message authentication code.
HTTP provides no security guarantees, and applications that use it cannot possibly provide users any security. When using a web application hosted via HTTP, people have no way of knowing whether or not they are talking to the true application server, nor can they be sure attackers have not read or modified communications between the user’s computer and the server.

Modes of Attack and Defense
However users connect to the Internet, there are a variety of people who can attack them — whether spying on them, impersonating them, tampering with their communications, or all three of these. The wifi network operator can do this; any ISP in the path between client and server can do it; anyone who can reconfigure the wifi router or another router can do it; and often, anyone else using the same network can do it, too.

Firesheep is a passive network attack: it eavesdrops on the contents of network communications between browser and server, but does not re-route or modify them.

By contrast, other freely-available tools perform active network attacks, in which the attacker does modify the contents of and/or re-route communications. These tools range from serious, such as sslstrip, to silly, like the Upside-Down-Ternet. Although Upside-Down-Ternet is a funny prank, it is technically identical to potentially more damaging attacks such as an attack that injects malicious code or incorrect information into web pages; at the same time, it shows that such attacks are easy enough to be jokes. Free wifi hotspots have been known to inject advertisements dynamically into web pages that users read — indicating that active network attacks are a viable business model. Tools like Cain and Abel enable a range of attacks, including re-routing local network traffic through the attacker's system. (Also see Arpspoof and dsniff.)

Only a mechanism that provides (at least) authentication, confidentiality, and integrity can defend against the full range of both passive and active attacks. HTTPS is currently our best option for web applications.

However, there are some potential pitfalls that site operators must avoid.

Mixed Content
When hosting an application over HTTPS, there can be no mixed content; that is, all content in the page must be fetched via HTTPS. It is common to see partial HTTPS support on sites, in which the main pages are fetched via HTTPS but some or all of the media elements, stylesheets, and JavaScript in the page are fetched via HTTP.

This is unsafe because although the main page load is protected against active and passive network attack, none of the other resources are. If a page loads some JavaScript or CSS code via HTTP, an attacker can provide a false, malicious code file and take over the page’s DOM once it loads. Then, the user would be back to a situation of having no security. This is why all mainstream browsers warn users about pages that load mixed content. Nor is it safe to reference images via HTTP: What if the attacker swapped the Save Message and Delete Message icons in a webmail app?
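To make the problem concrete, here is a minimal sketch (not a production scanner, and not part of any standard tool — the class and function names are my own) of how one might flag mixed content by scanning a page’s markup for subresources referenced via http://:

```python
from html.parser import HTMLParser


class MixedContentScanner(HTMLParser):
    """Collects subresource URLs that would be fetched over plain HTTP."""

    # Attributes that cause the browser to fetch a subresource.
    URL_ATTRS = {"src", "href"}

    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            # <a href> is navigation, not a subresource load; skip it.
            return
        for name, value in attrs:
            if name in self.URL_ATTRS and value and value.startswith("http://"):
                self.insecure.append((tag, value))


def find_mixed_content(html):
    scanner = MixedContentScanner()
    scanner.feed(html)
    return scanner.insecure


# Hypothetical page: the stylesheet and image load over HTTP, the script over HTTPS.
page = """
<html><head>
  <link rel="stylesheet" href="http://cdn.example.com/style.css">
  <script src="https://example.com/app.js"></script>
</head><body>
  <img src="http://images.example.com/logo.png">
</body></html>
"""
print(find_mixed_content(page))
```

A real audit would also need to cover CSS `url()` references, `<iframe>`s, fonts, and dynamically injected resources, but even this crude check catches the common cases of HTTP stylesheets, scripts, and images.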

You must serve the entire application domain over HTTPS. Redirect HTTP requests with HTTP 301 or 302 responses to the equivalent HTTPS resource.
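As a sketch of that redirect rule, the following hypothetical helper (the function name is illustrative) rewrites an http:// URL into its https:// equivalent, suitable for the Location header of a 301 response; the path, query string, and fragment must be preserved verbatim:

```python
from urllib.parse import urlsplit, urlunsplit


def https_redirect(url):
    """Given an http:// URL, return the (status, Location) pair for a
    permanent redirect to the equivalent https:// resource."""
    parts = urlsplit(url)
    if parts.scheme != "http":
        raise ValueError("expected an http:// URL")
    # Swap only the scheme; netloc, path, query, and fragment pass through.
    location = urlunsplit(("https",) + tuple(parts)[1:])
    return 301, location


status, location = https_redirect("http://example.com/inbox?folder=sent")
print(status, location)  # 301 https://example.com/inbox?folder=sent
```

In practice you would configure this in the web server itself (a server-wide rewrite rule) rather than in application code, but the mapping it performs is the same.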

Some site operators provide only the login page over HTTPS, on the theory that only the user’s password is sensitive. These sites’ users are vulnerable to passive and active attack.

Security and Cookies
As I described in a paper on secure session management for web applications, site operators must scope sensitive cookies (such as cookies used for user authentication) to the secure origin. If a cookie is broadly scoped (with the Domain attribute in the Set-Cookie: header), it may “leak” to other hosts or applications in the same domain — potentially less-secure hosts or applications.

Similarly, the application must set the Secure attribute on the cookie when setting it. This attribute instructs the browser to send the cookie only over secure (HTTPS) transport, never insecure (HTTP).
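As an illustration, here is how a narrowly scoped Secure cookie header might be built with Python’s standard http.cookies module; the cookie name and token value are placeholders, not part of any particular framework:

```python
from http.cookies import SimpleCookie

# Build a session cookie that the browser will send only over HTTPS
# (Secure) and that page scripts cannot read (HttpOnly).
cookie = SimpleCookie()
cookie["session"] = "opaque-token-value"  # placeholder value
cookie["session"]["secure"] = True
cookie["session"]["httponly"] = True
cookie["session"]["path"] = "/"
# Note: no Domain attribute is set, so the cookie stays scoped to the
# exact host that set it instead of leaking to sibling subdomains.

header = cookie.output(header="Set-Cookie:")
print(header)
```

The absence of a Domain attribute is deliberate: it is the broad `Domain=.example.com` scoping that lets a sensitive cookie leak to less-secure hosts in the same domain.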

Use Strict Transport Security
Strict Transport Security (HSTS) is an HTTP protocol extension that enables site operators to instruct browsers to expect the site to use HTTPS.
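As a sketch, an HSTS response header looks like the following; the helper function is hypothetical, and the one-year max-age is a common choice rather than a mandated one. The header takes effect only when sent over an HTTPS response — browsers ignore it on plain HTTP:

```python
# max-age is the number of seconds the browser should remember to use
# HTTPS for this host; includeSubDomains extends that to all subdomains.
ONE_YEAR = 60 * 60 * 24 * 365


def hsts_header(max_age=ONE_YEAR, include_subdomains=True):
    value = f"max-age={max_age}"
    if include_subdomains:
        value += "; includeSubDomains"
    return ("Strict-Transport-Security", value)


name, value = hsts_header()
print(f"{name}: {value}")
```

Once a browser has seen this header, it rewrites future http:// requests for the host to https:// before they ever leave the machine, which closes the window that tools like sslstrip exploit.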

Although not all browsers yet support HSTS, EFF urges those that don’t — we’re looking especially at you, Apple and Microsoft — to follow the lead Google and Mozilla have set by adopting this useful security mechanism. Indeed, ultimately we expect HTTPS (and possibly SPDY) to replace HTTP entirely, the way SSH replaced Telnet and rsh.

We recently enabled HSTS for eff.org. It took less than an hour to set up, and we found a way to do it without forcibly redirecting users to HTTPS, so we can state an unequivocal preference for HTTPS access while still making the site available over HTTP. It worked like a charm, and a significant fraction of our users are now automatically accessing our site over HTTPS, perhaps without even knowing it.

Performance Concerns
Many site operators report that they can’t move to HTTPS for performance reasons. However, most people who say this have not actually measured any performance loss, may not have measured performance at all, and have not profiled and optimized their site’s behavior. Usually, sites have latency far higher and/or throughput far lower than necessary even when hosting over HTTP — indicating HTTPS is not the problem.

The crux of the performance problem is usually at the content layer, and also often at the database layer. Web applications are fundamentally I/O-bound, after all. Consider this wisdom from the Gmail developers:

First, we listed every transaction between the web browser and Google’s servers, starting with the moment the “Sign in” button is pressed. To do this, we used a lot of different web development tools, like Httpwatch, WireShark, and Fiddler, plus our own performance measuring systems. [...]

We spent hours poring over these traces to see exactly what was happening between the browser and Gmail during the sign-in sequence, and we found that there were between fourteen and twenty-four HTTP requests required to load an inbox and display it. To put these numbers in perspective, a popular network news site’s home page required about 180 requests to fully load when I checked it yesterday. But when we examined our requests, we realized that we could do better. We decided to attack the problem from several directions at once: reduce the number of overall requests, make more of the requests cacheable by the browser, and reduce the overhead of each request.

We made good progress on every front. We reduced the weight of each request itself by eliminating or narrowing the scope of some of our cookies. We made sure that all our images were cacheable by the browser, and we consolidated small icon images into single meta-images, a technique known as spriting. We combined several requests into a single combined request and response. The result is that it now takes as few as four requests from the click of the “Sign in” button to the display of your inbox.

Google’s Adam Langley provides additional detail:
In order to do this we had to deploy no additional machines and no special hardware. On our production frontend machines, SSL/TLS accounts for less than 1% of the CPU load, less than 10KB of memory per connection and less than 2% of network overhead. Many people believe that SSL takes a lot of CPU time and we hope the above numbers (public for the first time) will help to dispel that. [emphasis in original]

Is it any wonder Gmail performs well, even when using HTTPS exclusively? Site operators can realize incremental improvement by gradually tuning their web applications. I gave a presentation to this effect at Web 2.0 Expo 2009.

Conclusion
HTTPS provides the baseline of safety for web application users, and there is no performance- or cost-based reason to stick with HTTP. Web application providers undermine their business models when, by continuing to use HTTP, they enable a wide range of attackers anywhere on the internet to compromise users’ information.

More to Come
Keep an eye out for Part Two of this whitepaper, which will go into more detail about how site operators can easily and incrementally improve site efficiency, thus enabling the move to HTTPS.

12/8/10

The Future of the Internet

“In only a few short years, electronic computing systems have been invented and improved at a tremendous rate. But computers did not ‘just grow.’ They have evolved… They were born and they are being improved as a consequence of man’s ingenuity, his imagination… and his mathematics.” — 1958 IBM brochure

The Internet is a medium that is evolving at breakneck speed. It’s a wild organism of sweeping cultural change — one that leaves the carcasses of dead media forms in its sizeable wake. It’s transformative: it has transformed the vast globe into a ‘global village’ and it has drawn human communication away from print-based media and into a post-Gutenberg digital era. Right now, its perils are equal to its potential. The debate over ‘net neutrality’ is at a fever pitch. There is a tug-of-war going on between an ‘open web’ and a more governed form of the web (like the Apple-approved apps on the iPad/iPhone) that has more security but less freedom.
So what’s the next step in its evolution, and what’s the big picture? What does the Internet mean as an extension of human communication, of the human mind? And forget tomorrow — where will the web be in fifty years, or a hundred? Will the Internet help make the world look like something out of Blade Runner or Minority Report? Let’s just pray it doesn’t have anything to do with The Matrix sequels, because those movies really sucked.

This article will offer in-depth analysis of a range of subjects — from realistic expectations stemming from current trends to some more imaginative speculations on the distant future.


Security
“Death of the Open Web”?

Those words have an ominous ring for those of us who have a deep appreciation of the Internet as well as high hopes for its future. The phrase comes from the title of a recent New York Times article that struck a nerve with some readers. The article paints a disquieting picture of the web as a “haphazardly planned” digital city where “malware and spam have turned living conditions in many quarters unsafe and unsanitary.”

There is a growing sentiment that the open web is a fundamentally dangerous place. Recent waves of hacked WordPress sites revealed exploited PHP vulnerabilities and affected dozens of well-known designers and bloggers like Chris Pearson. The tools used by those with malicious intent evolve just as quickly as the rest of the web. It’s deeply saddening to hear that, according to Jonathan Zittrain, some web users have stooped so low as to set up ‘Captcha sweatshops’ where (very) low-paid people are employed to solve Captcha security technology for malicious purposes all day. This is the part where I weep for the inherent sadness of mankind.

“If we don’t do something about this,” says Jonathan Zittrain of the insecure web, “I see the end of much of the generative aspect of the technologies that we now take for granted.” Zittrain is a professor of Internet governance and regulation at Oxford University and the author of The Future of the Internet: and How to Stop It; watch his riveting Google Talk on these subjects.
The result of the Internet’s vulnerability is a generation of Internet-centric products — like the iPad, the TiVo and the Xbox — that are not easily modified by anyone except their vendors and their approved partners. These products do not allow unapproved third-party code (such as the kind that could be used to install a virus) to run on them, and are therefore more reliable than some areas of the web. Increased security often means restricted or censored content — and even worse — limited freedoms that could impede the style of innovation that propels the evolution of the Internet, and therefore, our digital future.

The web of 2010 is a place where a 17-year-old high school student can have an idea for a website, program it in three days, and quickly turn it into a social networking craze used by millions (that student’s name is Andrey Ternovskiy and he invented Chatroulette). That’s innovation in a nutshell. It’s a charming story and a compelling use of the web’s creative freedoms. If the security risks of the Internet kill the ‘open web’ and turn your average web experience into one that is governed by Apple or another proprietary company, the Andrey Ternovskiys of the world may never get their chance to innovate.

Security Solutions
We champion innovation on the Internet and it’s going to require innovation to steer it in the right direction. Jonathan Zittrain says that he hopes we can come together on agreements for regulating the open web so that we don’t “feel that we have to lock down our technologies in order to save our future.”

According to Vint Cerf, vice president and Chief Internet Evangelist at Google, “I think we’re going to end up looking for international agreements – maybe even treaties of some kind – in which certain classes of behavior are uniformly considered inappropriate.”

Perhaps the future of the Internet involves social structures of web users who collaborate on solutions to online security issues. Perhaps companies like Google and Apple will team up with international governmental bodies to form an international online security council. Or maybe the innovative spirit of the web could mean that an independent, democratic group of digital security experts, designers, and programmers will form a grassroots-level organization that rises to prominence while fighting hackers, innovating on security technology, writing manifestos for online behavior, and setting an example through positive and supportive life online.

Many people are fighting to ensure your ability to have your voice heard online — so use that voice to participate in the debate, stay informed, and demand a positive future. Concerned netizens and Smashing readers: unite!

Freedom
Net Neutrality

Some believe that the fate of the Internet has been up for grabs ever since the federal government stopped enforcing ‘network neutrality’ rules in the mid-2000s. In a nutshell, net neutrality means equality among the information that travels to your computer: everyone has the right to build a website that is just as public, affordable, and easily accessible as any other. However, some companies, such as phone and Internet service providers, are proposing ‘pay tiers’ (web service where you need to pay premium fees in order to allow visitors to access your site quickly). These tiers of web service could kill net neutrality by allowing those who can afford premium service (particularly large media companies who don’t like sharing their audience with your blog) greater access to consumers than the average web user.

The debate over net neutrality reached a boiling point when Google and Verizon announced a ‘joint policy proposal for an open Internet’ on August 9th, 2010. Despite the proposal’s call for a “new, enforceable prohibition against discriminatory practices” amongst online content, many criticized it, citing leniency and loopholes.

Net neutrality needs to be made law. If the Internet were to have a slow lane and a fast lane, your average web user could lose many of his or her freedoms and opportunities online, thereby shattering the core values that make the Internet so profoundly valuable to society. However, that’s just the tip of the iceberg for this thorny issue. To learn more, read the full proposal or watch the Bill Moyers episode ‘The Net @ Risk.’

The World into the Web
Browser-based Everything

Google is developing a variety of applications and programs that exist entirely within the browser. Their PAC-MAN game was a preview of what’s to come because it allowed in-browser play of a simple, lightweight video game that required no downloads and relied on pure HTML, CSS, and JavaScript. At the company’s 2010 I/O conference, Google laid out its plans to develop “rich multimedia applications that operate within the browser” (according to this New York Times report on the conference). The company plans to offer in-browser web applications, such as photo-editing software (imagine using a Photoshop equivalent entirely within the browser), which it will sell in a web applications store called the Chrome Web Store.

If our programs and applications are about to be folded into the browser, what will exist within the browser in ten years? Currency? Education? Consciousness? Personally, I’m hopeful that my browser will be able to produce piping hot cheeseburgers sometime soon.
The Internet as a Collective Consciousness

The Internet is a medium, and philosopher Marshall McLuhan believed that all media are extensions of the human senses. The engine of our collective creative efforts is the force that’s causing the web to evolve more rapidly than any biological organism ever has.
The Internet is an extension of the collective human mind and it’s evolving into a medium of transcendence. By constructing a place where the collective human consciousness is both centralized in one location (on your computer) and globally accessible (for those with the means to reach or use a computer, that is), our human spirit is transcending the human body. Way back in 1964, McLuhan himself wondered, “might not our current translation of our entire lives into the spiritual form of information seem to make of the entire globe, and of the human family, a single consciousness?”

With the advent of trends including social media, ‘lifecasting,’ and ‘mindcasting,’ the Internet is being used as a real-time portal for the human consciousness. Perhaps those trends will be inverted by some web users of the future: instead of bringing offline life to the web (as so-called ‘lifecasters’ do when they stream live video of their attendance at an offline event), some web users will live their entire public lives online. Imagine a pop star who conducts her entire career online: every interview, live performance, music video or album release conducted solely through a browser window or mobile screen. Or a media theorist who exploited the platform of the web while discussing the theoretical ramifications of those actions. It’d be a great gimmick.

The Web into the World
The ‘Web of Things’

The ‘web of things’ or ‘Internet of things’ is a concept that will be a reality (at least in a rudimentary form) in the near future. The idea is that devices, appliances, and even your pets can all be tracked online. With Google Maps for iPhone, you can currently track your location on a digital map in relation to the streets and landmarks in your area. So it’s not hard to imagine a world where you can zoom in on your location and see detailed, 3D renderings of your surroundings: the cars on your block, the coffee machine in your kitchen, even Rover running around in your backyard! And it’s a good thing that you’re digitally tracking the location of poor Rover; he’s liable to hop the fence and make a run for it now that you’ve created a satellite computer out of everything you own (including his dog collar) by attaching a tracking device to it.

AT&T is betting big on the web of things. According to this Reuters article, the phone service provider is investing in tracking devices that could be installed in cars, on dog collars, and on the pallets used to move large shipments of products. The dog collar, for example, “could send text messages or emails to the owner of a pet when it strays outside a certain area, or the device could allow continuous tracking of the pet.”

Combine the concept of the ‘web of things’ with Second Life-style 3D imaging and you can imagine a web-based duplicate world — a virtual world that corresponds to the real one. But what are the implications of a world where every physical item has a corresponding digital existence online? Can we track the physical effects of climate change in the web of things? Will there be a digital avatar for every pelican carcass in the vicinity of the oil spill that’s devastating the Gulf of Mexico? It’s a tragic shame to develop a virtual world if we let the natural one go to waste in the meantime.

Interactive Landscapes

It has been said that today’s science fiction is tomorrow’s reality. Unfortunately, most good science fiction stories are cautionary tales set in dystopian nightmares.
Simon Mainwaring reports on the N building in Japan, where “the whole building facade has been transformed into a real time dialogue between smart phones and what’s going on inside the store.” The exterior of the building is layered with QR codes (an alternate form of bar code) that can deliver real-time information to your phone. In Steven Spielberg’s film Minority Report (adapted from a short story by mad genius Philip K. Dick), Gap ads came alive to hawk khakis to Tom Cruise. Looks like we’re about one step away from this scenario.

Mr. Mainwaring imagines a future with “billboards that watch you shop and make targeted suggestions based on your age, location and past buying habits,” and “stores will effectively be turned inside out as dialogue and personalized interaction with customers begins outside the store.”
The technology is cool, but it sounds like a pretty annoying future if you ask me. Who wants to be accosted by a holographic salesperson? The web grants us a great opportunity to use our collective voices to speak out on topics that matter to us. Because there are no regulations yet for much of this technology, it may be up to concerned citizens to make themselves heard if Internet-based technology is used in intrusive or abrasive ways.
The ‘Innerweb’

Cyborgs are among us already — humans whose physical abilities have been extended or altered by mechanical elements built into the body (people who live with pacemakers are one example). What will happen when the Internet becomes available on a device that is biologically installed in a human? What will the first internal user interfaces look like?

Here’s one speculation.

In the near future, we may be capable of installing the Internet directly into the user’s field of vision via a tiny computer chip implanted into the eye. Sound far-fetched? I doubt that it would sound far-fetched for Barbara Campbell, whose sight has been partially restored by a digital retinal implant (CNN reports on Barbara’s artificial retina).

Ms. Campbell was blind for many years until she had a small microchip surgically implanted in her eye. A rudimentary image of Ms. Campbell’s surroundings is transmitted to the device, which stimulates cells in her retina, in turn transmitting a signal to her brain. It’s a miracle that the development of a bionic eye has begun to help the blind see.

How else might doctors and scientists take advantage of the internal microchip? Perhaps the user’s vision will be augmented with an Internet-based interface with capabilities including geolocation or object identification. Imagine if technology like Google Goggles (a web-based application that identifies images from landmarks to book covers) was applied inside that interface. The act of seeing could not only be restored but augmented; a user might be capable of viewing a landscape while simultaneously identifying web-based information about it or even searching it for physical objects not visible to the naked eye. Apply the concept of augmented sight with the idea of the ‘web of things’ — an environment where physical objects have a corresponding presence on the web — and you can imagine a world where missing people are located, theft is dramatically reduced, the blind can see, and ’seeing’ itself means something more than it used to.

If the web is an extension of our senses, it follows suit that the web may be capable of modifying those senses or even accelerating their evolution.

The Crown Jewels
“The next Bill Gates will be the deliverer of a highly technological solution to some of our climate change challenges.” — Lord Digby Jones of Birmingham

In preparation for this article, I considered a variety of wild ideas and fun speculations about the future. Could the Internet be used to solve the problem of climate change, generate tangible matter, or contact extraterrestrial life? Maybe those ideas sound like the stuff of imaginative fiction, but in a world where quantum teleportation has been achieved and researchers have created a living, synthetic cell, it almost seems as if the concept of science fiction is being eradicated while real technology brings our wildest fantasies to life. Here is the result of my most daring (absurd?) speculation.

Time Travel
I called on physics teacher Mark Stratil to answer my last burning question: could the Internet ever be capable of facilitating the development of time travel? Here’s Mark’s answer:

“The Internet is still based on computers, which make linear calculations. Right now, all computers are based on binary code, which is a series of yes and no questions. You can make something that’s incredibly complex with a series of yes and no questions, but it takes a certain amount of time. The Internet still has to go through those calculations and it still physically has to make one calculation that will lead to the next calculation that will lead to the next. So no matter how fast we can get our computers – they’re making billions of calculations, trillions of calculations per second – there’s still going to be some lag time. They’re still limited by time in that way. They still need some time to make that conversation or that calculation.

In that way, they’re kind of chained to time. Their whole existence is based on a linear sequence of things that happen. In order to create something else, something that goes outside of time, you would have to make it a non-linear system — something that’s not based on a series of yes and no questions, because those have to be answered in a precise order. It would have to be some kind of system that was answering all the questions at once.”

So Mark’s short answer to my fundamental question was basically that the Internet, in its current state, would not be capable of facilitating time travel. However, if the Internet were liberated from the linear structure of binary code and migrated onto an operating system that ‘answered all questions at once,’ then maybe it could have the potential to manipulate time or transcend the boundaries of time.

Sounds unlikely at this point, but one of the Internet’s greatest capabilities is the opportunity to share and develop ideas like these!

Conclusion
Responsible Evolution
Through technology, we hold the reins to our own evolution.

For the first time in history, it might be said that there are moral implications in the act of evolution. The Internet is an extension of our senses and our minds, and its progress is propelled by our own creative and intellectual efforts. The future of the Internet will be shaped by millions of choices and decisions by people from all walks of life. Designers and programmers like us have the advantage of technical skill and specialized knowledge. Given the increasing presence of the Internet in our lives, our choices can have deep reverberations in human society.

We’ll face small choices like what color to use for a button and larger choices like which platforms to endorse and which clients to support with our work. But the real questions form broad patterns behind every media trend and every mini technological revolution. Can we use technology to develop solutions to environmental problems — or will we abandon the natural world in favor of a digital one and the ‘web of things’? Have we fully considered what it means to merge biology and technology? And finally, do we really need a digital tracking device on our coffee machines?

What a thrilling time to be alive! Let’s proceed with great enthusiasm and a commitment to designing a future that is meaningful, peaceful, and staggeringly exciting.

Partial Bibliography
* New York Times Magazine: “The Death of the Open Web”
* Simon Mainwaring: “The Future of Shopping: What Happens When Walls Start Talking”
* The Gutenberg Parenthesis: Parallels Between Pre-Printing Communication and the Digital Era
* New York Times: “Google Pitches a Web-Centric Future”
* Google Talks: The Future of the Internet
* Moyers on America: ‘The Net @ Risk’ (Video)
* Pew Internet Research Poll: Future of the Internet IV
* Google I/O: The Web is Killing Radio, Newspapers, Magazines, and TV (TechCrunch reports)
