Or am I the only one who remembers this opinion? I felt like it was common for people to say that the internet couldn’t be taken down, or censored, or whatever. This has obviously been proven false by the Great Firewall of China and by Russia’s latest attempts at completely disconnecting from the global internet. Where did this idea come from?
Because that was its purpose in the first place.
The basic building blocks of the internet were designed by DARPA, and it was designed with that military mentality of “If the ruskies nuke any part of our infrastructure, the rest of it should keep running.” You can chop large parts of the internet off and those parts stop working but the rest of it keeps going. Here’s an extreme example: I can unplug my cable modem and disconnect my house from the internet completely, yet I can still access the web pages hosted by my switch, Wi-Fi router and NAS through my local area network.
Mind you, a lot of that no longer works.
In the past, traffic could be routed over whatever path was available. If one node went down, the traffic would go over another.
Now we have a few very fast backbones, and if even one goes down, bye bye internet.
What you have cached locally or on your own machines doesn’t count, because it’s only what you’ve seen before.
Because it’s decentralized. So you can take a part of it out but not the whole thing. Unfortunately, in some ways it’s become centralized.
If Russia disconnects, or gets disconnected, then they’re just not part of the internet anymore. The internet itself will continue to exist, though - and would probably be quite a bit better.
I honestly hope they do. The amount of disinformation come election time would drastically drop.
“The internet sees censorship as damage and routes around it.”
From a very primitive perspective, this is true. Many of the infrastructure protocols (DNS, BGP, etc.) that the internet sits on are designed to be resilient and fault tolerant. Block access to a DNS server, and the system will find another one. Usually. Depending on circumstances.
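That failover behaviour can be sketched with a toy resolver chain. The resolver functions here are stand-ins for a stub resolver’s primary and secondary nameservers, not real DNS calls; the hostnames and addresses are placeholders:

```python
def resolve_with_fallback(hostname, resolvers):
    """Try each resolver in turn and return the first answer.

    This mirrors the usual stub-resolver setup with a primary and a
    secondary nameserver: if one is down (or blocked), the query
    falls through to the next one.
    """
    last_error = None
    for resolver in resolvers:
        try:
            return resolver(hostname)
        except OSError as exc:
            last_error = exc  # this resolver failed; try the next
    raise OSError(f"all resolvers failed for {hostname}") from last_error

# Stand-in resolvers for the sketch: one "blocked", one answering.
def blocked(hostname):
    raise OSError("connection refused")

def working(hostname):
    return "203.0.113.10"  # placeholder answer (documentation address range)

print(resolve_with_fallback("example.com", [blocked, working]))
```

And as the comment says: "usually, depending on circumstances" - if every configured resolver is blocked, the lookup simply fails.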
Firewalling an entire country is incredibly difficult. From a technical point of view, the GFoC is only modestly successful. It blocks casual and accidental access to the ‘outside world’ just fine, but for the determined operator there are absolutely ways around it - VPNs, cellular networks, satellite relays, you name it.
But do you want to risk having the police show up at your door with orders to kill on sight?
This is fundamentally no different than content filtering in a typical office. From my work computer, I can’t get to porn sites. If I really wanted to, I could find a way - but the odds are pretty good that HR would be at my desk with termination papers and a security escort out of the building.
The internet couldn’t and still can’t be taken down - but countries can certainly restrict it within their locale (though it is insanely difficult).
The opinion is that the internet, as a concept and a set of protocols, was and is too widespread to ever fully dismantle, and that one dude with a mission can capture and preserve an immense amount of data.
That’s still all true but doesn’t hold for social media walled gardens which have come to control a huge proportion of communication.
I’m confused. You’re citing the actions of a country to impact its own Internet as evidence they can take the Internet down?
That’s like saying me disconnecting my microwave proves that I can take down the power grid.
disconnecting my microwave proves that I can take down the power grid
DO NOT DISCONNECT YOUR MICROWAVE!
now i find out why the power went out today
The internet was originally designed to withstand nuclear war, so that a functioning military network could coordinate a retaliation quickly.
The network protocols themselves are self-healing, routing around failures, very resilient.
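A toy illustration of that routing-around-failures idea - a breadth-first search over a made-up link table, standing in for what real routing protocols do when a node disappears (the node names and topology are invented for the sketch):

```python
from collections import deque

def route(graph, src, dst):
    """BFS shortest path: a stand-in for how routing protocols find
    *some* working path to the destination, not necessarily the usual one."""
    seen, queue = {src}, deque([[src]])
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # destination unreachable

# A tiny mesh with two paths from A to D.
links = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
print(route(links, "A", "D"))      # normally goes via B

# Knock out node B entirely: traffic reroutes through C.
degraded = {"A": ["C"], "C": ["D"], "D": []}
print(route(degraded, "A", "D"))   # still reaches D, via C
```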
The internet itself, even today, is incredibly difficult to destroy. It is nearly impossible to take it down.
However, the internet that most people think of as the internet - Facebook, Google, etc. - is made up of centralized services that are trivial to take down.
Peer-to-peer protocols like email and torrents are also nearly impossible to take down.
The examples of Russia and China isolating themselves are different. That’s the network designers isolating the network. It’s not a third party trying to destroy the network.
Wait, email is p2p?
Yes, mostly. It’s distributed and federated - peer-to-peer at the email server level.
Domain A users can message domain B users directly without going to any other domain.
Fun fact: email can also handle variable-availability networks and use forwarding agents to get a message through, even indirectly (though most people don’t configure this anymore; in the days of dialup it was more common).
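A rough sketch of that server-level fallback. Real MTAs do this by sorting the domain’s DNS MX records by preference and trying hosts in order; the hostnames below are made up for the example:

```python
def pick_next_hop(mx_records, reachable):
    """Choose the next SMTP hop the way an MTA does: try MX hosts in
    order of preference (lower number first) and skip any that are
    currently unreachable. A lower-priority MX can queue the message
    and forward it on once the primary comes back."""
    for _pref, host in sorted(mx_records):
        if host in reachable:
            return host
    return None  # nothing reachable: queue locally and retry later

# Hypothetical MX records for a domain (hostnames invented for the sketch).
mx = [(20, "backup.example.net"), (10, "mail.example.net")]

print(pick_next_hop(mx, reachable={"mail.example.net", "backup.example.net"}))
print(pick_next_hop(mx, reachable={"backup.example.net"}))  # primary is down
```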
I see. Thanks for clarifying!
China’s Internet is basically just a VPN
Well, it’s not virtual, so it’s a PN
It’s not private either, so an N
But it’s local to China, so it’s a LAN.
China is locally wide, so it’s a LWAN
This opinion remains largely correct - the Internet as a network is very difficult to take down.
However, things have happened that have undermined the Internet in favour of commercial priorities.
Net Neutrality was a major principle of the Internet, but it is under attack, particularly in the US, where infrastructure providers want to maximise profit by linking their income to each GB used rather than being paid as a utility. Their costs are largely fixed in infrastructure, but they push the lie that they need to be paid according to how busy that infrastructure is. A network router doesn’t care whether it’s transferring 1 GB or 10 GB; it only matters if you hit capacity and the network needs to be expanded. The Internet providers instead want profit, profit, profit, so they are pushing for ways to maximise it.
The other major issue has been consolidation and that’s thanks to monopolies being allowed to form and dictate how the Internet works. Google, Facebook, Microsoft, Amazon and Apple - they’ve all used their services to try to manipulate customers into their walled gardens and prevent competition.
So the Internet as many people think of it is very vulnerable - big centralised services can have outages that affect everyone because people don’t have much choice.
But the reality is the underlying protocols and infrastructure remain robust. Google might have an outage, but the Web itself is still functional. Email protocols and file transfer protocols still work. The problem is that people sitting in Google’s walled garden of services are locked out of everything. And with Google’s huge monopoly on search and advertising, it means lots of other major services are out too.
So the Internet itself is fine. It’s the services and monopolies built on it that are the problem.
Net Neutrality was a major principle of the Internet but that is under attack, particularly in the US[…]
Not really focused on the US. Every nation, every corporation, every venal special interest group is fighting against net neutrality.
To expand upon “walled gardens”: the customers are not just you and me; they include the majority of the Internet, since it’s all running on the cloud, a.k.a. AWS, MS Azure, and Google Cloud.
A big change between the internet of the 90s/00s and today is that we no longer really have an internet where “all computers are equal”; we have a dozen massive websites - Facebook, Google, Reddit, TikTok - and it’s relatively easy to close one of them.
In the 90s a judge could ask an ISP to close someone’s homepage without impacting the whole internet.
yeah i miss geocities too
Authoritarian regimes must control the flow of information if they want to continue to exist. Just because they can disconnect themselves from the rest of the world doesn’t mean they’ve “taken down the Internet.”
Because back then it was a robust network with a myriad of websites, not just those four websites linking to each other. Also, they weren’t all dependent on AdSense or Akamai to function.
all traceroutes point to Akamai, like an internet Rome
A 1993 Time Magazine article quotes computer scientist John Gilmore, one of the founders of the Electronic Frontier Foundation, as saying “The Net interprets censorship as damage and routes around it.”
That applied a whole lot more when most connections were using a phone line, and a decent size city could have hundreds of ISPs. But part of the design of a redundant mesh network is that there are tons of different paths to any destination. Cutting any of those links would simply force traffic to other routes.
The early Internet was decentralized in other ways, too. Rather than flock to corporate platforms like Facebook, people spent a lot of time on federated and independent platforms. This included Usenet, IRC, and BBSes. In the event that the feds, lawyers, etc could take one down, a dozen more could spring up overnight. There was such a small barrier to entry, and many were run by hobbyists.
It’s somewhat true today. There are countless Lemmy instances that are completely independent. Pirate Bay famously references the Hydra, and it applies to their peers as well. But these are limited in scope.
Xitter has shown us just how quickly and thoroughly a platform can collapse through hostile admins, and how slowly people will reject it.
I moan about it regularly but this…
Rather than flock to corporate platforms like Facebook, people spent a lot of time on federated and independent platforms. This included Usenet, IRC, and BBSes
Is just tragic isn’t it? We really had it. A global free flow of hobbies, interests, research, debate, exploration.
I don’t know what’s so fundamentally flawed about human nature that a) something that started so well, like Facebook, gets enshittified to the extent that it has, and b) people flock to it like flies round a steaming turd.
The answer to your second point is simple.
Meta’s properties (FB, Insta) have something that most other social networks are lacking: A network of real-world family and friends.
Twitter, Reddit, Mastodon, Lemmy, Tiktok, and the rest all tend to have communities built from the platform’s population, based on shared interests. Meanwhile, FB is the platform that you use to connect with your oddball uncle and high school friends from way back. That’s the sunk cost that makes it so much harder to leave than the strangers on reddit who share your love of lime jello.
That’s a big part of the appeal of the fediverse for me. Setting up a personal site used to be fairly easy, but was largely isolated and unidirectional. With the AP protocol, and frankly a lot of self-hostable apps in general these days, you can make something to converse with the whole globe and you don’t even need to make a big effort to help people find it.
Webrings still exist, but finding them is less than trivial when they get drowned out by the noise of corporate sites. I’ve used IRC within the last year, but had to look up the proper use of NickServ commands. The old web mentality is still out there, but for the most part people want simplicity. Few want to go through the learning curve to deal with some of the more esoteric parts of it when they can just auth into a site and do a thing.
It’s a truism in writing; villains act and heroes react. If someone looks at the internet and sees a way to exploit it they will. They don’t care that it’s working fine for everyone else; they want the money.
“The Net interprets censorship as damage and routes around it.”
As an example of this, one of the easiest and most performant methods a nation has of blocking a website is dictating which DNS records its ISPs return for domains.
This has the advantages that it doesn’t require traffic inspection and doesn’t slow traffic at all.
But it has the disadvantage of an all-or-nothing effect on the domain, e.g. it can’t be used to block specific pages.
It can also be bypassed by simply using an international DNS server. There are people bypassing this kind of censorship without even knowing they are doing so.
over the years i think the internet has proved to be a bit flaky
BGP (https://en.wikipedia.org/wiki/Border_Gateway_Protocol) has proved to be non-trivial, and mistakes have been made, both accidentally and maliciously, that have broken large chunks of the internet
having said that, i think P2P protocols - Kademlia, Gnutella, BitTorrent over UDP, Tor, etc. - have proved to be very resilient and hard or impossible to break despite concerted efforts to do so. these protocols have adapted to hide using VPNs, HTTP and other tunneling techniques, and services have distributed themselves effectively enough that they have never been eradicated
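As a tiny illustration of why Kademlia-style networks have no single point to attack: every node independently ranks peers by XOR distance to a key, so lookups work from anywhere without a central index. This sketch uses toy 4-bit IDs; real deployments use much larger ID spaces:

```python
def xor_distance(a, b):
    """Kademlia's distance between two node/key IDs is just bitwise XOR.
    Every node can rank peers by closeness on its own -- there is no
    central index to seize or take down."""
    return a ^ b

def closest_nodes(key, node_ids, k=3):
    """Return the k known node IDs nearest to `key` under XOR distance,
    the core step of Kademlia's iterative lookup."""
    return sorted(node_ids, key=lambda n: xor_distance(key, n))[:k]

# Toy 4-bit ID space; real networks use 160-bit IDs.
nodes = [0b0001, 0b0110, 0b1011, 0b1100]
print(closest_nodes(0b0111, nodes, k=2))  # the two peers nearest key 0b0111
```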
There is a lot of confused misinterpretation in this thread. “Can’t be taken down” was a thing, but it was about how you can lose big chunks of the network and have the rest of it still work. That was misinterpreted at the time, in fairness, and it’s even less true now, where centralization in both the infrastructure and the hosting have a lot of things dropping at once and being disrupted, but it’s still technically true. Ukrainian drones are out there beaming up to satellite internet and being used in active warfare in the middle of a battlefield. Which hey, in that context, robust military communication was the original intent of the network to begin with. Given the previous baseline is wired telephone, the characterization isn’t that far off.
Censorship is different, but also true. You can isolate a chunk of the Internet, and once you’ve done that if you have very centralized control you can monitor it, but that’s a high bar. And of course outside of those cases people struggle to limit communications they don’t want, from nazi chatter to piracy.
At the time I used to think that was a good thing, now… yeah, harder to justify. Turns out “free information” didn’t automatically make everyone smarter. I have lots of apologies to give to teachers and professors of theory of communication that were trying to explain this to us in the 90s and we were all “nah, man, their only crime was curiosity, hack the planet, free the information” and all that.