Like most college students, I am annoyed that I am poor. I would like a way to sell the spare computing power of my laptop over the Internet to people who would pay for it, like deep learning folks. I would be willing to share 50% of the profits with anyone who can figure out how to do this.
Haha the problem is that even if you have a pretty souped-up gaming desktop, its computing power is probably worth less than the power costs, so you’d basically be selling just your room’s power.
Maybe you live in a dorm and you don’t have to pay for that power, but even then, we’re talking about pennies.
The problem of “college students are annoyingly poor” is a big niche. What do you know about converting your time to money through your computer?
Good point, though there should be value on the other end at least. For example, if 100 people on a network each need more than their laptop’s computing power 1% of the time, then in the ideal case the average person would get a 100x speedup for that 1% of the time without providing a credit card. So they could train an image classifier in 6 minutes instead of 10 hours.
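A quick sanity check of that arithmetic, using only the numbers already in the claim (the ideal-case pooling is the one assumption):

```python
# Back-of-the-envelope check of the pooled-compute claim above.
n_peers = 100          # people on the network
burst_fraction = 0.01  # each one needs extra compute 1% of the time

# On average only n_peers * burst_fraction = 1 peer is bursting at any
# moment, so in the ideal case a burster gets ~n_peers times its own power.
speedup = n_peers

baseline_hours = 10
pooled_minutes = baseline_hours * 60 / speedup
print(pooled_minutes)  # 6.0 -> 6 minutes instead of 10 hours
```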
Also I should admit that I’m only poor in the relative sense—I need rice, beans, and a few dozen square feet, and I have those things covered.
Hmm, it probably is more lucrative to convert my time to money, though I think it’s better to invest my time in increasing my future earnings, which would probably be worth way more than what I could make working part-time as a college student.
Actually, my biggest gripe about my life right now is that college is inefficient in so many ways (500-person lectures, required classes that are mostly wastes of time, absurd tuition), yet I don’t know how I could get the things I like about it (flexible schedule, great peers, some extremely good teachers, an excuse to be a student) anywhere else.
These 100 strangers who need bursts of computation can pay $5 to spin up a powerful Amazon EC2 instance for a couple hours. That seems like a good deal for the value they’re getting, and very hard to undercut. So I see no startup opportunity.
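For a sense of why that’s hard to undercut, here is a rough seller-side calculation. Every number below is an illustrative assumption, not a real quote:

```python
# Seller-side economics of renting out a gaming desktop.
# Every number here is an illustrative assumption, not a real quote.
watts_under_load = 400       # assumed desktop power draw at full load
usd_per_kwh = 0.12           # assumed residential electricity price
ec2_usd_per_hour = 2.50      # assumed rate implied by "$5 for a couple hours"
desktop_fraction = 0.05      # assumed: desktop delivers ~5% of that instance

power_cost_per_hour = watts_under_load / 1000 * usd_per_kwh
market_value_per_hour = ec2_usd_per_hour * desktop_fraction

print(power_cost_per_hour)    # ~$0.05/hour just for electricity
print(market_value_per_hour)  # ~$0.13/hour gross; pennies of margin at best
```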
Re college...
If “flexible schedule, great peers, extremely good teachers, excuse to be a student” is really what you want, I can easily get you all that for only $10k/year, a fraction of what you’re probably paying now. But the truth is, college’s main value-add is the expectation of a better career.
These days, college is doing a pretty terrible job of helping people get any careers at all. I know 4 separate people who got their college degrees, couldn’t get any jobs, trained for a few months in software engineering through bootcamps or online courses, then got 6-figure software engineering jobs.
Khan Academy and various coding bootcamps are already becoming a viable alternative to college, and I don’t see an obvious niche for a new startup.
Re college, you shouldn’t compare “regular education” graduates to people who went to college AND a bootcamp; you should compare them to those who did NOT go to college at all and instead only took courses online or went to a bootcamp.
I don’t have any data, but my feeling is that the latter will consist of two starkly different categories: those at the right end of the IQ tail, who’ll succeed regardless; and those who decided to be cheap and are fucked, because “you only have a high school diploma??” is going to be the standard reaction from their prospective employers.
Re Amazon instances: I recall that Amazon EC2 has a “free tier” that allows you several hundred hours of CPU time (albeit not very powerful). So perhaps for half the strangers, it would cost $0. Even less opportunity for a startup.
There are pretty much no use cases that benefit from high-latency clusters of computers. We’re talking hundreds or thousands of times less efficient. Nice idea in theory; it doesn’t hold up in practice.
Neural networks seem like they would benefit from high-latency clusters. If you divide the nodes up into 100 clusters during training, and you have ten layers, it might take each cluster 0.001s to process a single sample, so the processing time per cluster is maybe 100-1000 times less than the total latency. That’s still acceptable if you have 10,000,000 samples, since you can amortize the latency by shipping many samples per round trip and allow some weight updates to arrive a bit out of order. Also, if you just want the forward pass of the network, that’s the ideal case, since there are no state updates.
In general, long computations tend to be either stateless or have slowly changing state relative to the latency, so parallelism can work.
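Here’s a minimal simulation of that “updates a bit out of order” idea, assuming plain SGD on a toy quadratic loss with a delay queue standing in for the network; all constants are made up for illustration:

```python
from collections import deque

# Toy illustration: SGD on f(w) = (w - 3)^2 still converges when
# gradient updates are applied late (stale), mimicking workers
# separated by WAN latency. All constants are made up.

def gradient(w):
    return 2.0 * (w - 3.0)

def stale_sgd(staleness, steps=2000, lr=0.01):
    w = 0.0
    in_flight = deque()  # gradients still "on the wire"
    for _ in range(steps):
        in_flight.append(gradient(w))      # computed against the current w
        if len(in_flight) > staleness:
            w -= lr * in_flight.popleft()  # applied `staleness` steps late
    return w

print(stale_sgd(staleness=0))   # synchronous baseline: ~3.0
print(stale_sgd(staleness=20))  # 20-step-stale updates: still ~3.0
```

The queue length here plays the role of the latency-to-compute ratio above; crank it up far enough and the iterates start to oscillate, which is the regime where “a bit out of order” stops being harmless.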
Sorry, I was using “high-latency clusters” to refer to heterogeneous at-home consumer hardware networked over WANs, as the term is sometimes used in this field. The problem isn’t always latency (although for some workloads it is), but rather efficiency.

Consumer hardware is simply not energy-efficient for most categories of scientific work. Your typical, average computer plugged into such a system is not going to have a top-of-the-line GTX 1080 or Titan X card with lots of RAM. At best it will be a gaming system optimized for a different use case, one that probably trades off energy efficiency at peak usage in favor of lowering idle power draw. It almost certainly doesn’t have the right hardware for the particular use case.

SETI@Home, for example, is an ideal use case for high-latency clusters, and by some metrics it is one of the most powerful ‘supercomputers’ in existence. However, it has also been estimated that the entire network could be replaced by a single rack of FPGAs processing in real time at the source. SETI@Home and related projects work because the computation is “free.” But as soon as you start charging for the use of your computer equipment, it stops making any kind of economic sense.
I would be interested in a cite on that estimate.
Personal conversation with SETI.
The electricity you pay for the computing costs more than the value it produces. That’s why CPU bitcoin mining with spare consumer hardware isn’t profitable.
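To make that concrete, a rough profitability calculation; every figure below is an assumed round number, not real market data:

```python
# Why CPU bitcoin mining on spare hardware loses money.
# Every figure is an assumed round number for illustration only.
cpu_hashrate = 20e6          # assumed: ~20 MH/s for a desktop CPU
network_hashrate = 2e18      # assumed: ~2 EH/s across the whole network
block_reward_usd = 10000.0   # assumed: 12.5 BTC at ~$800/BTC
blocks_per_day = 144         # one block every ~10 minutes

expected_revenue = (cpu_hashrate / network_hashrate
                    * blocks_per_day * block_reward_usd)
print(expected_revenue)      # ~$0.00001/day of expected revenue

electricity_cost = 100 / 1000 * 24 * 0.12  # 100 W at $0.12/kWh
print(electricity_cost)      # ~$0.29/day: ~20,000x the revenue
```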
Additionally, there’s trust involved: nobody has a good reason to trust you to do the calculations exactly the way they desire.
How much do you think your spare computing power is worth?