It's possible, but the whole reason Starlink has a userbase is that there are large areas where traditional telecom companies either never had enough interest to set up infrastructure or let what had already been set up rot. For that to change meaningfully would require a significant shift in strategy from Verizon, AT&T, etc.
Outside the US there is a major FTTH rollout going on everywhere. The UK will get to ~90%+ penetration within a few years (it's at 66% now, or 83% if you include HFC DOCSIS 3.1, and growing at ~1% a month, including some very rural areas).
All of Europe is basically like this, and will have/already has overwhelming FTTH coverage. The same is happening in Asia and even in the more developed parts of Africa.
So every month the addressable market for Starlink, IMO, declines. There will of course be places so rural that they won't get FTTH for a long time (though I think they eventually will), and underdeveloped countries will struggle with the rollout for a long time too - but those underdeveloped places also lack the capacity to pay for Starlink.
What Starlink is actually amazing at is bringing the price of fixed-line connections down. A lot of countries have ridiculously high fixed-line/mobile data costs (I would assume some level of corruption keeps competition out). Starlink will push those prices down and force providers to offer unlimited data packages in those areas. However, I'm not convinced Starlink will capture much of that benefit.
Don't get me wrong - Starlink is an awesome service that really benefits humanity. However, I think the long term economics for it are poor for it to grow substantially more (this may be ok as I believe it is EBITDA positive now). And I think churn will be a problem in developed countries as more and more of them get FTTH.
How could it be a US-centric view when the US has faster bandwidth than almost every country in the world? It's ranked 5th-6th in the world for average bandwidth and 10th in median.
I literally said that it wasn't true. There is a common misconception that the US is miles behind other countries in terms of FTTH coverage and/or rollout has stalled.
I think T-Mobile trying to buy US Cellular suggests they are approaching 5G home internet as a real growth opportunity. The infrastructure rollout is just far more reasonable than trying to run wires everywhere.
Yeah, land-based networking just makes less and less sense when you look at it from first principles: wireless is fast enough for most connections, it has enough bandwidth if distributed, and being in orbit makes widely distributed networks trivial compared to doing it on earth.
I don't know if it will be Starlink, but I do expect (hope) ground-based telecom to go the way of floppy disks in the coming decades.
Unless you're on a boat, being on land is a huge advantage. 5G only needs ~10,000 towers to cover most people in the US. From there you scale density based on where people actually live.
Satellites, however, move, so you either accept that the network sucks in moderate-density areas or you carry vastly more coverage in low-density areas than you need.
Wired connections inherently provide a lot more bandwidth, and in dense urban environments the last mile isn’t a mile it’s within a building.
Can you cite that number? It really doesn't pass the sniff test for me, unless the word "most" is doing some pretty heavy lifting.
A quick search suggests that a 5G tower operating only at low/mid-band spectrum (in other words, below peak speed, but at higher range) has a range of only 1 to 3 miles. [1] We'll say 2. That's an area of pi*2^2 = ~12.5 square miles; we'll say 13. The area of the US is 3.8 million square miles. So your number would provide coverage for (10,000 * 13) / 3.8 million = 3.4% of the US. That may be enough to cover the most exceptionally dense urban locations, but you're missing a lot of people there.
And, again, this is just for the low/mid-band stuff. You also need to regularly maintain those towers, whereas you could get global coverage with relatively few satellites that can be trivially launched and decommissioned remotely. A quick search [2] turns up a current practical (not peak/theoretical) bandwidth for Starlink in the 100+ Mbps range with ~50ms latency. I have difficulty seeing a logical argument for ground-based telecom, besides as a hedge against WW3, when probably the first thing to happen would be a huge chunk of all satellites going poof.
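For what it's worth, the arithmetic above checks out in a few lines (the 2-mile radius, 10,000 towers, and 3.8M square mile figures are the assumptions from the post, not measured data):

```python
import math

# Back-of-envelope coverage estimate. Assumptions from the post above:
# ~2 mile effective radius per low/mid-band tower, 10,000 towers,
# US land area ~3.8 million square miles.
TOWER_RADIUS_MI = 2.0
NUM_TOWERS = 10_000
US_AREA_SQ_MI = 3_800_000

area_per_tower = math.pi * TOWER_RADIUS_MI ** 2   # ~12.6 sq mi per tower
covered = NUM_TOWERS * area_per_tower             # ignores overlap entirely
coverage_pct = 100 * covered / US_AREA_SQ_MI

print(f"{area_per_tower:.1f} sq mi per tower, ~{coverage_pct:.1f}% of the US covered")
```

Using pi exactly rather than rounding up to 13 square miles gives ~3.3% rather than 3.4%, but either way it's a small slice of the country.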
Your first link says “On average, the maximum usable range of a cell tower is 25 miles.” Which is relevant because we aren’t trying to provide service just for high density areas.
Before you ask whether 10,000 * 2,000 = 20 million square miles is high: handoffs require you to be in range of multiple towers, so there's a lot of overlap, and hills, ocean, etc. reduce useful range.
Ultimately there are 142,100 cell towers in the US, but that includes density from urban areas and redundancy from multiple cell networks. Anyway, the point I was making was that Starlink is targeting low-density areas by necessity; they simply can't target NYC density with any reasonable constellation size. However, if you're a cellphone company already covering anywhere in the US with 50+ people per square mile, extending that to anywhere with 5+ or even 0.5+ people per square mile and killing Starlink just doesn't take that many towers.
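The 20 million square mile figure above can be sketched the same way (25 miles is the quoted maximum usable range; real useful coverage is far smaller for the overlap and terrain reasons just mentioned):

```python
import math

# Same back-of-envelope as before, but using the quoted "maximum usable
# range" of 25 miles per tower instead of a 2-mile low/mid-band radius.
TOWER_RADIUS_MI = 25.0
NUM_TOWERS = 10_000
US_AREA_SQ_MI = 3_800_000

area_per_tower = math.pi * TOWER_RADIUS_MI ** 2   # ~1,963 sq mi (the ~2,000 above)
raw_coverage = NUM_TOWERS * area_per_tower        # ~19.6M sq mi before subtracting
                                                  # overlap, hills, ocean, etc.
print(f"raw coverage {raw_coverage / 1e6:.1f}M sq mi vs US area {US_AREA_SQ_MI / 1e6:.1f}M")
```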
"Usable" is going to mean at the max possible wavelength. The problem with telecoms is that there's a physics imposed inverse relationship between frequency (speed) and wavelength (penetration/distance). So it's not like computing where we basically have gotten a free lunch with stuff that goes faster, runs cooler, and takes up less space.
Each upgrade in telecom entails a sacrifice. You can have really fast signals that can't go far and have difficulty penetrating obstacles like walls/buildings/hills/etc., or you can have far-reaching, highly penetrating signals that can't go fast. For instance, Verizon's max-speed towers can only reach 1,500 feet [1], so I think my estimate of ~2 miles was a pretty reasonable meet-in-the-middle.
All that said, I agree with you in principle. Obviously space-based telecom is much better suited to less populated areas than heavily populated ones, but I'd argue that space-based can scale much more easily. Ground-based telecom isn't just those 140k towers, but also the other 450k nodes on top. And that's to cover a pretty small geographic area. Each of those nodes not only needs land and construction permits, but also regular maintenance, and so on. It's a pretty big deal. For space-based coverage, you can just launch your satellites from Texas and have them providing coverage on the other side of the world in a matter of minutes.
Put another way - imagine we were creating a civilization from scratch and these technologies were all 'unlocked.' I don't think we'd be using ground based stuff much at all. In the present when the infrastructure already exists, there's no reason not to take advantage of it, but in general it just doesn't scale so well.
Don't forget those towers cover the vast majority of the US population with high-speed connectivity, whereas Starlink only has ~1 million US customers, 1/300th of the population. Those ratios aren't that far off in terms of customers per unit, but the problem with scaling satellites is that they don't stay in one location.
You can't just put 50 satellites next to each other over a suburb and call it a day; you need a ring (or rings) of satellites circling the entire globe to reach whatever your target density is along their full orbit. Unfortunately, most land has really low density: North Dakota only averages 11 people per square mile, while Florida, a mostly empty state, sits at 422.
Target 10 people per square mile (adjusting for household size and the percentage of people signing up) and just about all your satellites are useful across the entire US.
But pick 100 people per square mile and 90% of your time over North Dakota is wasted. Worse, large chunks of Florida are also nearly empty, as most of its population is along the coastline in places like Sweetwater, where 8,800 people per square mile live. So your wasteful 100-people-per-square-mile constellation in ND still only covers a small fraction of the population in Florida.
Cellular is the reverse: the first 10k towers are largely "dead weight" that cover few people per tower, but the rest of the 130k are really useful because you optimize locations for density. Swap that to satellites and initially the constellation has very high utilization, but the ratio keeps getting worse as you add more satellites.
PS: Starlink could try to vary speeds or prices more based on density, but people really want predictable results for their money.
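The utilization argument can be sketched as a toy model. All the orbit-time shares and densities below are my own rough illustrative guesses, not Starlink or census data:

```python
# Toy model: a satellite's orbit crosses terrain of varying population
# density. If the constellation is sized for some target subscriber
# density, time spent over areas below that target is wasted capacity.
orbit_share = {            # fraction of orbit time over each terrain type (guessed)
    "ocean": 0.70,
    "empty land": 0.15,
    "rural": 0.08,
    "suburban": 0.05,
    "urban": 0.02,
}
density = {"ocean": 0, "empty land": 0.5, "rural": 5,
           "suburban": 50, "urban": 5000}   # people per sq mi (guessed)

def utilization(target_density):
    """Share of orbit time spent over areas dense enough to fill capacity."""
    return sum(share for region, share in orbit_share.items()
               if density[region] >= target_density)

for target in (1, 10, 100):
    print(f"target {target:>3}/sq mi -> {utilization(target):.0%} of orbit useful")
```

With these made-up numbers, targeting 1 person per square mile keeps ~15% of the orbit useful, 10 drops it to ~7%, and 100 drops it to ~2% - the shape of the curve, not the exact values, is the point.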
This is what I was trying to say above, but much more detailed and eloquent. There are a lot of dislikes in this thread but not many folks addressing the points.
The thing about high-density places with a ton of infrastructure is that wired will always be best, because you have close access to infrastructure and it's likely already built in. For rural or even suburban areas, the equation starts to tip toward orbital wireless for the reasons you state above, and also because geographic realities make ground-based wireless unreliable where there are mountains, valleys, canyons, etc.
There's a 4-orders-of-magnitude difference between high-density areas and low-density ones. So no, you don't need millimeter wave everywhere. You can increase bandwidth per tower, but you can also increase the number of cell sites.
Further, every frequency you add removes users from the other frequencies. I.e., at 10 miles you can use a subset of frequencies, but those frequencies don't need to cover the people 100m from the cell tower, because those users are on the faster bands.
Thus doubling the number of cell sites means there's an extra circle of people on mm-wave frequencies around each new tower, so you more than double effective bandwidth in low-density areas when you double the number of towers.
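That "more than double" claim can be illustrated with a toy model (every parameter below is an assumed number, picked only to show the mechanism): doubling towers doubles mid-band capacity and moves the users nearest the new towers onto mm-wave, so the remaining mid-band users share more capacity among fewer people.

```python
import math

# Toy model: users inside MM_RADIUS_MI of any tower are served by mm-wave;
# everyone else shares the towers' mid-band capacity. All numbers are
# illustrative assumptions, not real spectrum or capacity figures.
AREA_SQ_MI = 2_000.0
USER_DENSITY = 5.0            # users per sq mi (a low-density area)
MM_RADIUS_MI = 0.5            # assumed mm-wave serving radius per tower
C_MID_GBPS = 1.0              # assumed mid-band capacity per tower

def midband_rate(towers):
    users = AREA_SQ_MI * USER_DENSITY
    on_mmwave = min(users, USER_DENSITY * towers * math.pi * MM_RADIUS_MI ** 2)
    return towers * C_MID_GBPS / (users - on_mmwave)   # Gbps per mid-band user

ratio = midband_rate(200) / midband_rate(100)
print(f"doubling towers multiplies the per-user mid-band rate by {ratio:.2f}x")
```

The ratio comes out a bit above 2x because capacity doubles while the pool of users sharing it shrinks; the effect grows as mm-wave circles cover more of the population.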
Meanwhile the reverse happens with satellites. For a given number of satellites there are some areas where you have sufficient capacity for the local density. Suppose you have enough satellites for ships and aircraft over the ocean; add new satellites to handle higher density on land, and the time those satellites spend over the ocean isn't getting you new customers. I.e., the percentage of time the average satellite is at 90+% capacity drops when you add more satellites.
Building things deployed on land, in real life, basically sucks. Building out a tower requires buying the land (or possibly getting involved in extremely dirty eminent-domain lawsuits), getting countless building permits/inspections, architecting your build in accordance with local regulations and any geographic peculiarities, organizing a construction team, [finally] building it, and then maintaining the tower itself along with the various regulatory, tax, and other requirements that come with it. And that's for exactly 1 tower! You really cannot overstate how big of an ordeal this is. If you think NIMBYism is bad for housing, think about how people feel about phallic, energy-emitting towers reaching hundreds of feet into the air around them.
By contrast, SpaceX: build satellites, launch satellites, done. They can launch tens (and soon hundreds if not thousands) from their base in Texas with a single launch. There's still some bureaucracy to deal with, but this is overall a many-orders-of-magnitude difference in scalability and overall ease. And when satellites start hitting end-of-life - no problem, just deorbit them and continue expanding the swarm.
I think this is only kind of true. 10,000 towers is a lot when you consider that probably half of them are in very low-density areas. LEO satellites are a really good way of covering desolate areas (oceans, deserts, farmland with very low population density, etc.) and areas with medium density and lots of elevation changes that would mess with ground-based coverage. Starlink was never going to work well for urban (or even suburban) areas, but airplanes, boats, and ~20% of the US population is nothing to scoff at.
Cell towers already exist to provide cellular service in low-density areas. In general, people expect service to be generally available, not simply available in cities and subdivisions.
Some of that is provided for free when covering higher density areas which satellite networks like Starlink simply can’t handle at any kind of reasonable constellation size.
Satellite internet is inherently a niche market. The physics simply don't allow, for example, good satellite coverage of even 10% of NYC, because you just can't get enough satellites flying over NYC at the same time to handle that much data. And there are limits to how much better you can make the individual satellites themselves, since you have a very limited heat budget in space, where you have to rely 100% on radiative cooling.
So, for any square mile of land, you can have at most some small number of subscribers. Sure, you can cover a huge surface, but only with a very low density. Conversely, the vast majority of the world's population lives in huge clumps in small areas.
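To put a very rough number on "some small number of subscribers per square mile" - every figure below is an assumption for illustration, none of them are published Starlink specs:

```python
# Crude per-area capacity ceiling for a satellite constellation.
# All parameters are assumed for illustration only.
SATS_VISIBLE = 8           # satellites able to serve a given cell at once (assumed)
GBPS_PER_SAT = 20.0        # downlink capacity per satellite (assumed)
CELL_AREA_SQ_MI = 300.0    # ground area that capacity is shared across (assumed)
PER_SUB_MBPS = 100.0       # advertised per-subscriber rate (assumed)
CONTENTION = 20            # typical ISP oversubscription ratio (assumed)

total_mbps = SATS_VISIBLE * GBPS_PER_SAT * 1_000
max_subs = total_mbps / PER_SUB_MBPS * CONTENTION
subs_per_sq_mi = max_subs / CELL_AREA_SQ_MI
print(f"ceiling of ~{subs_per_sq_mi:.0f} subscribers per square mile")
```

That lands around ~100 subscribers per square mile. NYC averages roughly 27,000 people per square mile, so even huge error bars on these assumptions leave dense cities far out of reach, which is the point being made above.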
Billions of people live in low density areas. Hundreds of millions live in NYC-type very high density areas. Satellite can still serve very high density, just at limited rates. Satellite internet is very far from niche.
Starlink should (and perhaps already does) approach commercial high-rises with a single uplink per building, shared through the building network. Even as a backup data system, it's still very valuable.
Satellites really can’t provide coverage at full 1 acre lot suburbs level density let alone NYC level density.
Starlink has ~1 million customers in the US from ~6,000 satellites, so you'd think they could do 10x that with 10x the satellites. But much of the US is low enough density that their current constellation is already sufficient, and effective bandwidth per satellite is maximum bandwidth * percentage of orbit spent in useful locations. Which means 10x satellites is closer to 3x useful bandwidth, and it gets much worse the higher the density you're aiming for.
Ah, you might think: just offer lower bandwidth per customer in urban areas. But people will pay less as the bandwidth drops, and Starlink is already fairly slow.
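The "10x satellites is closer to 3x useful bandwidth" intuition can be sketched like so. The utilization fractions are made up purely to illustrate the shape of the argument, not SpaceX data:

```python
# Toy model of diminishing returns: effective bandwidth is satellites
# times per-satellite bandwidth times the share of orbit spent over
# areas dense enough to use the capacity. All numbers are assumed.
def effective_bandwidth(n_sats, per_sat_gbps, useful_fraction):
    return n_sats * per_sat_gbps * useful_fraction

base = effective_bandwidth(6_000, 20, 0.15)    # today: assume 15% of orbit useful
big = effective_bandwidth(60_000, 20, 0.05)    # 10x sats must chase denser areas,
                                               # so a smaller share of orbit pays off
print(f"10x satellites -> {big / base:.1f}x effective bandwidth")
```

With a useful-orbit share falling from 15% to 5% as the constellation targets denser areas, 10x the satellites yields only ~3.3x the effective bandwidth.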
I don't think satellite is cost effective for high density areas. If you have enough subscribers in a given geographic location, it makes more sense to put up some cellular towers, which can be maintained by people on the ground.