Home
I imagine that someday any citizen will be able to buy a small computer and connect it to the Internet just to rent CPU time to the public. It will be similar to the way power utilities allow customers to sell solar power back to the grid whenever homes produce more energy than they use.

I realize that something like this is already being done for music file sharing services. And the SETI project can access your unused CPU time to search for ET. I'm talking about an expansion of what already exists. The business model and the legal hurdles are probably bigger obstacles than the technology.

Imagine buying a computer and plugging it into your Internet connection at home. The first menu that comes up allows you to choose between private computing (just you) or public, meaning the world can use your computing power on demand. And you get a discount on your own "computing utility" bill when your CPU is used by others. Depending on pricing and demand, you might get a positive investment return on your capital expense for the computer.

In California, solar customers can reduce their energy bill by the amount of power they "sell" back to the grid. But consumers can't legally sell any excess energy they produce above their own billing level. I assume lobbyists are to blame for this ridiculous situation. For now, let's happily imagine that our hypothetical computing grid doesn't have that limitation.

Someday all of your important files will be stored in the cloud. For many of us, that's already the case. It's time to move our CPU needs to the cloud too. In the future, if you can't afford a computer, you can pay a low monthly fee to have access to spare computing power on the Internet. I'm guessing that might cost $5 per month for the basic package, with a premium subscription service that offers higher speeds. The service should be cheap because most computing power on the planet sits idle most of the day.

With this business model, everyone on earth would have access to the equivalent of a supercomputer in the cloud for a few bucks per month plus whatever they pay for basic Internet access. You'll never have to upgrade your computer, upgrade your software, install anti-virus software, or worry about any of the headaches of computer ownership.

Citizens would need little more than a smart screen with a browser that can connect to the computing grid. That's still a computer, but it can be fairly basic. It just needs a browser.

For this model to work on a large scale you'd need to have WiFi in airplanes and everywhere else citizens need to access the Internet, but we're well on the way to that world.

It's not clear to me that a large company or even a government needs to be involved in building the system I'm describing. You could probably get there with an open software project. In fact, it's probably the only way to get there because large companies have a stranglehold on the status quo.

Data privacy is a huge issue with this sort of business model, obviously. But I wonder if spreading your data and CPU usage across multiple processors and servers might actually give you better security than your current system in which all of your private stuff is conveniently organized on one computer so hackers can easily find it. Instead of having your credit card number stored in one location, the number might be broken up across several servers. If one server gets hacked, the thieves only get a partial number. And they wouldn't have any way to know which servers have the rest of your digits.

By analogy, no one would try to steal your car if they knew it was disassembled and the parts were hidden all over your home. The analogy breaks down because crooks could steal and sell car parts. But if a hacker had only two digits of your credit card number it wouldn't be worth much.
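For the curious, the splitting idea can be sketched in a few lines of Python using XOR secret sharing. This is a hypothetical illustration, not a description of any deployed system; real services would use something more elaborate, but the property is the same: any subset of shares short of all of them reveals nothing about the number.

```python
import secrets
from functools import reduce

def split(secret: bytes, n: int) -> list[bytes]:
    """Split a secret into n shares; all n are required to reconstruct it."""
    shares = [secrets.token_bytes(len(secret)) for _ in range(n - 1)]
    last = secret
    for s in shares:                       # XOR each random share into the secret
        last = bytes(a ^ b for a, b in zip(last, s))
    return shares + [last]

def combine(shares: list[bytes]) -> bytes:
    """XOR all shares together to recover the secret."""
    return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), shares)

card = b"4111111111111111"                 # a dummy card number
assert combine(split(card, 3)) == card     # round-trips correctly
```

Each server in the imagined grid would hold one share; a thief who hacks one server gets bytes that are indistinguishable from random noise.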

You may now commence shredding this idea.

 

Comments

Mar 27, 2014
Nerdalize is marketing this sort of grid computing as a space heater for your home. The computer provides heat for your home, it spares data centers the cost of cooling, and it gives computing power to researchers who need it!

Here's their front page blurb:
"Computers generate heat when they’re working. Instead of treating that heat as waste, like most datacenters do, we use it to keep homes warm and comfortable.

Our heater comfortably warms buildings by doing useful computations. We pay the bill so you save on heating costs.

The sustainable computing power generated by the heater is sold to academics, science and industry.

This way homeowners profit from a lower energy bill, businesses from more affordable computing power and all of us from a lower carbon footprint."
 
 
Mar 6, 2014
The only way this approach would work is if the "terminal" were free or incredibly inexpensive. I can buy a quad-core i7 with a terabyte of storage for under $500. That's a four-year product lifecycle for most tasks, longer for others like email and browsing. Unless I'm editing a long HD video, this machine is way more powerful than most things people use computers for (maybe the latest immersive games push the envelope on equipment).

Make the terminal like a cable box, free, with a monthly subscription for grid-based CPU resources, and a chance to add power on an as needed basis for an additional fee.
 
 
Mar 6, 2014
Well, everyone here is talking about the computing aspects, but I would like to point out the physics.

The reason power companies pay so much for power from a solar panel is that it lowers transmission costs. From memory, I think about 40% of energy is lost as heat by the time it transfers from the power plant to your house, so one watt of power generated at your house is worth more than one watt at the power station.

On the other hand, those transmission costs (in energy) cut the other way when data has to be transferred to an off-site server and then back to where it is used.
 
 
Mar 5, 2014
My brother and I thought this one up a long time ago, when 3D rendering was the big thing. We imagined ourselves as having a large clean room with gazillions of computers, renting out time on them to 3D design houses that didn't have the money for their own render farm. Needless to say, we never did it.

On another note, I worked for a university and I was in charge of setting up the software that would be auto-installed on every new computer - about 2000. It was a simple background piece of software that nobody would notice. At night, when all those professors and staff went home, I could fire up the rendering engine and come back the next morning with my beautifully rendered movie.

I was soooooo close. But I didn't do it. Still, the idea of using so many unused computers seemed like a brilliant idea. It still does, but it would be for a dedicated industry and purpose, not your average home user downloading !$%* and checking Facebook. Our computers nowadays are way overpowered for that.

But farming it out to a specific industry and paying those who opt in might be a neat way to make a few extra bucks, or the credit could be applied directly to their service-provider bill. That might be a better solution.

Example: BioTechnics Inc. wants to create a super-virus to wipe out humanity and needs to harness a lot of CPU to crunch the numbers. You sign up for their program and install the software. The amount they pay you depends on the amount of time they use your CPU.

Simple. Throw me a share of stock when it's funded. -Erich

 
 
Mar 5, 2014
Got to be honest... this one kinda sucks like a $5 !$%*! behind a doughnut shop. Sure it gets the job done, but it's not the best idea.

The LAST thing I want is some jackhole using my CPU to do ANYTHING. I already fight with background software that is installed with everything now.
 
 
Mar 4, 2014
[So...there is no market for it...and it is already being done for voice. Got it. -- Scott]

Yes, there may be a market for that kind of cloud computing. Google's Office-like apps are another example. But the market is for *free* computing. I wouldn't pay $5/month for it, and I'm pretty sure the market would dry up if it costs real money.

I run Linux that gets updated every four years or so. The few applications I use are super easy to update: Linux tells me there are updates and I push the button if I want them. Done. I think most people would be fine and dandy with the current state of their technology for years, though. Most updates aren't really needed to get work done; they just seem to add bells and whistles. And sometimes I hate the changes that come with updates. But with cloud computing, you're at the mercy of designers who think change is the coolest thing on the planet. You've now lost the choice to upgrade or not.

One other thing: security. Your car analogy has a problem. Sure, the crooks might not want to steal my car if it were parted out all over the house. But then again, I wouldn't want to reassemble my car every day to go to work, only to disassemble it when I got home. Same with credit card numbers and other secret info. I would likely have an app that finds all those pieces and reassembles them when I go to Amazon. And that app can be dug up by whoever is tramping through my computer.

So maybe a cloud computer might be virus-free, but so what? Most viruses we're worried about nowadays are there to steal your data. Now hackers would just skip the virus development and go direct to my data.

Honestly- I can't see a problem that's being solved here, or at least a problem that's not being replaced with another problem.
 
 
Mar 4, 2014
[When you say most people don't need supercomputers you sound to me like the folks who once predicted that the worldwide demand for computers was approximately six. -- Scott]

You still have not mentioned specifically what ordinary users want to do with their computers but can't because they don't have enough processing power.
 
 
Mar 4, 2014
I work for a major microprocessor company and we have a similar system (or at least we did five years ago, when I worked in that area). Everyone would have their own UNIX system, and it would run large compile jobs as well as your own stuff. This worked great as a way to get lots of processing power without extra boxes. However, the issue came when someone unplugged or locked up their system: it would kill the job, which would have to be restarted. Expand that across the world and this would happen constantly. You would have to do a lot of work to build in redundancy. I'm not saying it isn't worth it, but I see this more as a useful thing for companies than for individual consumers, who don't really need a supercomputer's power.

But now picture you are a company with a bunch of servers that are rarely fully loaded. You also occasionally need crazy computing power (compiles, rendering, etc.). You could include your computers in this SuperGrid and also pay for time on it. You would have a running balance of CPU time that fills up when others use your systems and is spent when you use the SuperGrid. It could be a moneymaker, or just more efficient: you could essentially use a week's worth of your computers' capacity all at once when rendering a scene, which would be great. Also, these SuperGrid computers would have much better uptime and consistency, and most likely the rate you get for sharing yours would have both a processing-power and an uptime component to keep things fair. Users just getting started could simply pay for processing power instead of investing in powerful systems up front (like those who use Amazon cloud services). I think that would be a really useful and profitable system. Heck, if they are smart about it, web server companies could install it on all their systems: they would make money at low load, immediately throttle their SuperGrid traffic when more people hit the server, and if load gets even worse they could borrow from the SuperGrid.
 
 
Mar 4, 2014
"You'll never have to upgrade your computer....."

Unless....

"pay a low monthly fee to have access to spare computing power..."

....you are one of the people who MUST buy and upgrade in order to have the spare computing power to sell in the first place.
 
 
Mar 4, 2014
I wrote...
----------------
There's no market because wholesale computing power is so cheap that those who need it can just buy it (or rent from the cloud, where the costs are less than maintaining a SETI-like infrastructure).

The idea of pushing your heavy compute tasks up to the cloud seamlessly already happens whenever you ask your phone to do voice recognition.
-----------------------------------
Scott replied....

[So...there is no market for it...and it is already being done for voice. Got it. -- Scott]
---------------------------------------------
Hi Scott - I explained myself badly. Yes, it's being done for voice, and other things. What I was trying to say is that it's so cheap to add computing at the cloud, and there's so much overhead in trying to get that resource from unused desktops, that people who need the extra cycles will just add them to the cloud rather than soliciting unused PCs. (The capital cost of adding the computing is less than the servicing cost of finding and dealing with the PCs; we've gotten to the point that cycles are nearly free.)

Feel free to disagree - but let's agree what we disagree about 8-}

best
/j
 
 
Mar 4, 2014
Stupid local computer posted too soon!

[Well, there's the supercomputer power, no viruses, no software hassles, no physical security risks, no upgrading, no backing up files, no rebooting. Sold yet? -- Scott]

Most people don't need supercomputer power. The only intensive resource use case that applies to a scalable number of people is gaming, but that has mostly shifted to consoles. Most high-resource applications have a very limited audience, and typically require a level of interactivity that is problematic for a distributed system due to network latency (AutoCAD is a good example). Until we have the ansible of Orson Scott Card's imagination, these sorts of applications are going to have to run locally. The people running those applications aren't the target audience of your idea, regardless; they are typically going to be business users who can afford the higher-end software.

The rest of the benefits you list can be just as easily met with cloud computing solutions like Azure or AWS -- more easily, for some of them.

[When you say most people don't need supercomputers you sound to me like the folks who once predicted that the worldwide demand for computers was approximately six. -- Scott]
 
 
Mar 4, 2014
@Paladin42

No, the difference is that someone could break those things. It would also be dangerous because you would be giving them physical access to a space you own or use. This worry doesn't translate to renting out computer cycles.
 
 
Mar 4, 2014
Now if we can apply this concept to puppies, babies, and rainy weather...
 
 
Mar 4, 2014
I don't think this idea will catch on for the same reasons that people don't rent out their car when they're not using it, or rent out space in their house that's not being used at the moment, or let other people use their sports equipment when they're not playing or practicing, etc.
 
 
Mar 4, 2014
Scott,

Imagine your brain is directly connected to the cloud. You won't need a visual interface; it's all in your head.

And I can leisurely hack into the brains of POTUS, Dick Cheney, Scott Adams and that punk in pink on Coney Island.

Empty minds suck, na?

:)

 
 
Mar 3, 2014
I suspect that the cost of the electricity required to run the computer - which more or less scales with how much work the computer is doing - is, or in the near future will be, more than the cost of the hardware (when amortized over any reasonable period).

I'm not quite sure whether that kills this idea, but I believe it puts a major crimp in it.
 
 
Mar 3, 2014
The thing I like about Scott Adams' predictions is that it is usually easy to figure out why they are so wrong.

Usually it is because Scott is not thinking like an economist, or at least not a good one. The bookstores are full of books by smart people (physicists, mathematicians and economists) who tried to play the market and lost a bundle by not thinking through the Catch-22 that makes markets chaotic. These people did not think like economists either. They didn't even think like GD bankers. They thought like physicists, mathematicians and bad academic economists. Go teach Nassim Nicholas Taleb and your grandmother to suck eggs.

And I believe the financial markets ARE chaotic despite the best efforts of crooks and geniuses to make them work like a cash machine.

Basically this is another prediction that the fashion will swing away from smart terminals to dumb terminals such as most employees used when I was studying computer science in 1979 (first high school class in Comp Sci).

This prediction is made by geeks and journalists on a regular basis and it has always been wrong.

Why? Well, why don't you take the bus or a taxi instead of owning a car?

Think like an economist. How can the consumer rationalize giving all of her personal data away to the infernal maw of the web? This is only a good idea if you have almost infallible security (close doesn't count except in horseshoes, as they say). It is also only a good idea if you are getting great value for your money. But if you are getting great value for your cloud subscription, Apple is going bankrupt or selling its cloud for less than it could make by putting the RAM into a box and selling the box at a ridiculously high price.

Sure, they are selling their boxes at a ridiculously high price, but they are also charging a ridiculously high price for all the bits that make a computer work, which they have cleverly not put into the shiny box, and for which they are charging Fan Boys and Girls massive amounts of money on top of making computers that cost at least twice as much as a good PC.

So how are you going to get from cheap, reasonably secure smart terminals back to dumb terminals if the consumer isn't getting a sweet deal and the owners of the big smart machines behind the cloud are likely to be exposed as pirates worse than any hacker?

And what of the advertising money? Some people may consent to be data-mined to death or deluged with ads for penis enlargers and art gold with Glenn Beck's ass printed on both sides, but why would anybody sign up for a sweet deal for techno-plutocrats that allows a handful of corporatist giants to turn you into techno-serfs and milk cows forever without you catching on?

Nope. Chances are that dumb terminals will only make a comeback if China decides that it can't get what it wants from Apple, Google, etc. Because there ain't no money in this business plan. Or if there is, things are good enough for the super-rich the way they've been doing things for the last century.

I believe we will see clean, efficient, and courteous mass transit before we see dumb terminals replace smart terminals. The terminals are already smarter than the lusers. You aren't using them; they are using you.

At least Scott is savvy enough to be using you, too. There is something to his business plan, but not necessarily the ideas he pulls out of his ass and floats past us lusers. Some of them, as he has admitted, are actually thinly disguised experiments. He's more of an engineer than a banker, and that's a good thing, as Martha Stewart would say.
 
 
Mar 3, 2014
As someone knee-deep in the computer industry, this thought has definitely crossed my mind before, but I've realized there are a combination of reasons it'll never take off:

- consumers don't need faster processors. A 2nd-generation iPad is still more than fast enough to do what 99% of people use computers for: checking email and Facebook.
- even if they did... Moore's law. Computers are still getting faster and cheaper. By the time a system like this could be set up, computers would be another 2-4x faster than they are today, and they are already needlessly fast.
- even if a consumer did still need a faster processor (for something like video editing, which I do), cloud computation still doesn't make a ton of sense.
- You can't split computation into tiny chunks and send them across the network, because network latencies are so high (compared to latencies inside a CPU) that it will never be worth it to send small loads across a network.
- For bigger chunks you could, but that would still involve huge amounts of data (when I'm editing video, we're talking 50 GB for <30 minutes of render time). Internet speeds would have to increase 100x for there to even be a potential time savings.
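The bandwidth objection above is easy to check with back-of-envelope arithmetic; the link speeds below are illustrative assumptions, not measurements:

```python
# Time to ship a 50 GB editing project over a home link,
# at an assumed 2014-era 25 Mbps and at a link 100x faster.
size_bits = 50 * 8 * 10**9          # 50 GB expressed in bits
for mbps in (25, 2500):
    seconds = size_bits / (mbps * 10**6)
    print(f"{mbps:>5} Mbps: {seconds / 3600:.2f} hours")
```

At 25 Mbps the upload alone takes longer than the local render, which is the commenter's point.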

Regarding renting CPU time:

A friend and I were thinking about this in terms of bitcoin mining - we could code a website that would steal people's spare CPU/GPU cycles to mine bitcoin while they browsed the site... but the sad truth is that we would never actually make any money from it. Even with bitcoin mining, which is the epitome of monetizing spare processing power, consumer computers just aren't powerful enough to make real money anymore.
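The hopelessness of browser CPU mining follows from the arithmetic. All figures below are rough, illustrative assumptions about early-2014 conditions, not measurements:

```python
# Expected revenue from one visitor's CPU mining for a full day.
# Every number here is a rough, illustrative assumption.
visitor_hashrate = 10 * 10**6       # ~10 MH/s, a generous CPU SHA-256 rate
network_hashrate = 30 * 10**15      # total network hashrate, early-2014 ballpark
reward_usd = 25 * 600               # 25 BTC per block at ~$600/BTC
blocks_per_day = 144                # one block roughly every 10 minutes

daily_usd = (visitor_hashrate / network_hashrate) * reward_usd * blocks_per_day
print(f"${daily_usd:.6f} per visitor-day")   # a tiny fraction of a cent
```

Even under these generous assumptions, a visitor mining all day earns the site well under a penny, which is why the scheme never pays.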
 
 
Mar 3, 2014
For businesses, this might *eventually* make sense. Companies not involved in human resources regularly outsource their HR tasks to companies that specialize in it. Same for accounting. ADP anyone? Still, given the other comments on here I am sure you can see why it can never work for most people. But that brings this back around to existing services that small businesses already buy, like web hosting and CAD design.

As for security: well, it's a matter of scale. Why would a hacker want to get into a single computer for a single person's information? Maybe if the target is rich and not terribly bright. But hackers don't care about those little scores. Oh no, they want the big score: the American Express server, the Visa accounts, Target. See where this is going? It doesn't matter if the parts of your private information, or the security keys to them, are in different places, because eventually something has to bring them together to be used. The thieves will target that something. Security through obscurity has never worked out well in the real world; I can't imagine that your version of it will do any better.

 
 
 