"the Institute for Advanced Study in Princeton, N.J. The institute was the brainchild of its first director, Abraham Flexner. Intended to be a "paradise for scholars" with no students or administrative duties, it allowed its academic stars to fully concentrate on deep thoughts,"
I highly recommend reading "Anathem" by Neal Stephenson. The basic premise is that there are institutions known as "Maths," essentially walled temples whose gates only open once every 1, 10, 100, or 1,000 years (the outermost gate annually, the next decennially, and so on). Inside, a bunch of monks hang around studying "useless knowledge," which in this universe includes things such as particle physics (there's one "thousander" temple in which the monks just sit deep in a mountain staring at a giant pool of water to see if they can glimpse a flash of light).
I always thought it would be really cool to have something similar in our world; I hadn't realized it had been pursued before.
In Anathem, there was a global catastrophe of an unspecified nature caused by (or at least blamed on) science. Physicists, mathematicians, etc. are segregated from broader society (and from high technology) to prevent future recurrences, because their disciplines are viewed as too dangerous to allow in the "saecular" (sic) world.
A mountain retreat for "useless" science is a nice idea, but Anathem's "Maths" were something else entirely. They were actually quite dystopian.
The catastrophe (of the main storyline universe) was unequivocally caused by science. If I recall correctly, it was implicitly three orders of magnitude worse than the technological super-weapons of our own World War 2.
Metaphorically, the scientists were then treated like the plutonium pits of nuclear weapons--well isolated, so that they could not form a critical mass and start a runaway chain reaction, and arms-controlled, so that none of them could be exploited by politics.
When the world needed a lot of brainpower, fast, the scientists were taken out of their safe storage and connected to each other using the separate monastic order of technicians, to intentionally cause a runaway chain reaction. Basically, the whole world was already pretty much boned, so the political leaders might as well set off their doomsday device--which was nothing more than to allow the smartest people on their planet to collaborate in real-time. That was the same thing that had previously (nearly) destroyed their civilization.
"Just because we checked our guns at the door doesn't mean our brains will change from hand grenades."
I don't know if they were necessarily dystopian. In the book, the Maths survive throughout global wars and other saecular catastrophes. So sometimes they'd come out of the gate and everyone would be cavemen; other times, it would be 1980s-style civilizations with televisions and cars. Seems like a pretty rosy situation to me.
The Maths weren't exactly Hogwarts for nerds. They were a way to keep the academic disciplines from interacting, to prevent another catastrophe.
Also, siloing off knowledge is kind of crazy. If a discovery in a 100-year Math happens at year 2, then it takes 98 more years before the outside world gets the research.
I agree that sharing knowledge is a good idea, but it doesn't always work that way according to history. For things that are "pure knowledge" such as mathematics or philosophy, sometimes the true geniuses would rather keep their work to themselves. Newton invented calculus and didn't share it with the world until Leibniz took credit for his independent work. Then there are questions of how often people may have just never shared things (or they just pretended to know more than they did), such as Fermat's Last Theorem.
I think some of the people that are attracted to this kind of work are going to do whatever they want to do, incentives be damned.
The article heavily promotes the idea of the Institute for Advanced Study in Princeton, N.J., where both authors of the article are situated. Richard Feynman rather famously argued against places like that: http://calteches.library.caltech.edu/52/2/dignified.htm
“Nothing happens because there’s not enough real activity and challenge: You’re not in contact with the experimental guys. You don’t have to think how to answer questions from the students. Nothing!”
I haven't read it completely yet, but the opening sentence, "At least I'm living; at least I'm doing something; I'm making some contribution," resonates with me strongly.
Coming from an academic family, I tried to do a PhD, but it was just 2 years of procrastination. So I decided, screw it, I'll find a job, and I've been a programmer at a corporation ever since.
It's pretty good, but somehow I wish I had more time to pursue things that go beyond one quarter. So I wish, as Feynman did, that there were a place where I could do normal programming most of the time and yet, when I have an idea, pursue that idea. Google's 20% time comes close (although it has really been dead for maybe 10 years), but it's still not quite that.
And you could have that for any field, not just software. So perhaps we don't need ivory towers where people pursue basic research full-time, but something more balanced.
20% time isn't dead, but it's also not useful for what you want it for. 20% time rarely works for projects that stretch much beyond a quarter. The amount of attention you can devote to them prevents you from finishing them in a timely-enough manner that the initial conditions which made you think of the idea still hold.
(Trust me, I spent 3 years of 20% time writing an HTML5 parser that was largely obsolete by the time it was released. It was desperately needed when I started it in 2010 and the only options were validator.nu and html5lib. It wasn't so needed by the time it was released in 2013 and you also had JSoup, Hubbub, html5ever, and probably others to choose from.)
Same reason part-time startups usually go nowhere.
I think that the system that actually does work is something like Silicon Valley, where if in the course of your job, you see an idea, you can just quit and try it. If it doesn't work out, there's always another job out there waiting, which won't hold the work you spent on that idea against you. Anyone with a modicum of financial discipline should also be able to save enough money off a Silicon Valley engineer salary that they have the option to go a year or two without income, too.
That's actually my objection to 20% time, I would much rather take a 100% sabbatical for certain things. It could actually work out to 20% or more on the average, it's just there isn't enough institutional trust in large corporations to pull it off, IMHO.
Anyway, I am glad that 20% still works, it's better than nothing IMHO. I wouldn't do startup because I am not really interested in the logistics of pitching etc.
It's interesting that even though perhaps a majority of people believes that some freedom on the job would be beneficial, we cannot create a society which has this freedom.
I think it's somewhat unfortunate how people have come to associate "startup" with "pitching". Basically 0% of the great technology companies have come out of a businessman pitching a VC on an idea they just had. Rather, it's usually a technologist who builds something, gets a few hundred of his friends to use it, and then happens to meet a VC who says "Look, if you took our capital, your hundreds of users could be billions."
You do have to be really ruthless about what features you cut from v1 of the product for this to work, though, and you have to be somewhat unorthodox about problem selection.
As for freedom - you actually do have freedom to work on whatever you want at work, and this applies to most corporations, not just Google and its 20% time. Most employers would actually rather that you take full ownership of getting the task done, and give them a better outcome rather than blind obedience. The problem is that with freedom comes responsibility. As long as you do exactly what your boss tells you to, you can't be blamed when things go wrong. If you do something additional or contravene your boss's orders, they will usually not mind as long as it works out better than expected. Your boss will usually be happy to claim credit for it, and if you're lucky, you may even end up with a promotion & a raise. But if it works out poorly and your boss has to clean up the mess, you're going to get fired. As long as you're willing to take responsibility for getting fired, you can do whatever you want at work.
I feel like a lot of my effectiveness at Google (and that of many of the coworkers who I considered effective) was that I was willing to get fired. And when, after 5 years there, I felt myself getting a little complacent and hewing a little too closely to the party line, I quit. You still have agency as a human being, even in a big company.
> Anyone with a modicum of financial discipline should also be able to save enough money off a Silicon Valley engineer salary that they have the option to go a year or two without income, too.
Is that supposed to still hold true with a family and mortgage, or is it just me that finds that hard? I can totally understand swinging that with less life responsibilities, but I find it hard to save in this region where the cost of living is so high and there's five of us. :/
It still holds true, but you have to settle for a lower standard of living (in terms of living space) than you would elsewhere in the country.
My wife grew up in Cupertino and thinks it strange that I want our eventual kids to have their own bedrooms; she shared a room with her brother until she was about 10. We live in a 2BR townhome and the other units in the complex (all 2BR) include a family with 4 kids and a grandfather; a family with 3 kids; and a multi-generational family with a couple, the wife's mother, and an infant.
The bargain you make for Silicon Valley is that you'll have great career opportunities, a laid-back work environment, huge diversity of food & culture, awesome outdoor sports and great weather, and a small shot at great wealth, and in return you put up with terrible traffic, near-constant work, lots of superficial people, and crummy housing stock that, if it doesn't fall down in an earthquake, will be eaten by termites in 10 years. Some people like that bargain, some don't - and in particular, it's often a better bargain when you're young than when you're older. If you don't, you should carefully consider whether Silicon Valley is the right place for you.
What I hate is that new construction in the area seems somewhat brain dead too.
I have a fairly large house, but I'm also an hour north of SF, which while better, really just means that prices are very, very high instead of insane. My daughters share a room because even though I'm drowning in space, most of it is very poorly utilized in new construction (and this is new). My home office is in one of the walk in closets off the master, and I comfortably use less than half of it. My master bedroom is large enough to make two rooms easily, but the layout makes that impracticable. The one saving grace? The large formal living room which we converted to the main office for my fiance's business (fully utilized, shelves stacked with supplies on all walls).
The other options in the area? 50 year old houses with 75% the floor space (admittedly better used), and an actual back yard. Same listing price range, but you'll have to fix a lot of stuff immediately or soon.
Being new factored heavily into our decision process, but at this point I'm thinking we might have made a poor choice. Then again, the grass is always greener...
There were guaranteed to be lots of people working on html5 parsers. If you want to have a niche to yourself, you have to pick something more esoteric.
Well, the project started because I needed it for some other project I was working on. I didn't say to myself "My grand contribution to computer science is going to be an HTML5 parser", I just started writing it, and then it was mostly done by the time the other alternatives came out.
From what I hear (even from discussion between Googlers here), 20% time is alive and well, but it's somewhat manager-specific in how it's carried out and what is expected.
On the other hand, some descriptions of people using 20% time sound like it just opens the door to them feeling okay about using company time to pursue open-source side projects, with or without approval, as long as they are getting their work done. That's probably a good middle road, because as long as the company is happy with your performance, and you are happy with the possibility that not spending that extra time to wow your managers might delay your advancement somewhat, it's probably as open as you could want. This of course assumes a salaried position.
Feynman was talking about the IAS in his day. I've been a PhD student at Princeton for 5 years, and there's plenty of interaction between students and IAS faculty. The difference is that IAS people aren't responsible for students of their own. You could make the argument that IAS people aren't FORCED to interact with students, but the opportunities are there, I think.
The link that you've provided is an example for, not against, such institutions. The quote: "There was no importance to what I was doing, but ultimately there was."
In other ways, it's a great time for useless knowledge. One, barriers of entry are lower in a lot of cases (the internet, computers). Two (partly as a corollary of #1), there are more people empowered to seek useless knowledge (greater education, better communications technology).
This affects some fields more than others: computer science research is cheap, but molecular biology research is still dependent on expensive equipment.
> A committee of the U.S. Congress found that in 2012 business only provided 6 percent of basic research funding, with the lion’s share — 53 percent — shouldered by the federal government and the remainder coming from universities and foundations.
Yet another instance where "big government" is good for society, despite the bad rap that it gets these days.
Looking forward to reading this, but since I don't see a link in the comments here or in the body of the article: you really should read the essay by Flexner that's referenced in the headline. It's hosted at the IAS here:
It worked for me, but the summary is that in the past there has been a place for "pure" research (which is to say, research not motivated by a particular financial goal and not subject to restriction). The article argues that the good that has come of that has greatly exceeded the investment, and so "we" should do more investment like that.
I don't disagree; open-ended research really does further the good of the community. Its results are also impossible to predict, which means the typical economic forces aren't present to keep it funded.
I've wondered sometimes if the lottery proceeds would be better spent on this sort of activity rather than public schools. The reasoning for that is that the public schools don't have any visibility into how much (or how little) they will get from the lottery in a given year so they have a hard time allocating funds for on-going projects, instead doing one-offs like new classrooms or some other self contained project. Whereas if you dumped that money into a grant pile and people could apply for grants out of it for research, that might work better.
Sadly, because (and the article points this out) you can't say definitively "this research is going to give you these benefits in 10 years" you can't really say that you haven't just wasted all the money you spent. And that is where unconstrained research gets killed.
Perhaps public schools should be funded with something more fair and reliable, such as income taxes. (I'd go for wealth taxes as more fair, but that's a long shot.)
The humanities should get a chance here too. There's a wonderful quote from Peter Berger. Roughly, he said that sociologists had so far escaped public attention in America, except perhaps among the few southern racists sufficiently literate to read the footnotes to Brown vs. Board of Education.
Well, that's the most von Neumann-and-Turing-centric account of von Neumann's role in the invention of computing that I've ever seen: no trace of Eckert, Mauchly, U. Penn or ENIAC in the story at all!
In ancient times, there was a television show called "Connections" [1] in which the host followed the multiple links and serendipitous events that led to some modern inventions, technologies, and scientific achievements.
We need more resources, so that we could dedicate more to open-ended research without expecting it to pay off in any predictable way. You literally never know what can come out of a particular purely theoretical piece of research.
Well, that would be great! It would be great to have such a surplus and spend it on esoteric things that will pay off in a century or two, or maybe a mere 30 years.
Are you personally happy to fund it? I'm sure a number of donation programs exist.
Otherwise, I'd concentrate on minimizing resource waste, especially in institutions like government, to obtain that resource surplus in the first place.
The U.S. is far wealthier than any nation ever, including the U.S. of the 1930s and '40s (it's not even comparable). Whether or not this gets funded is not a question of affordability.
I know that our budgets are already creaking under the load of our spending and entitlement programs, but I would absolutely love to see a taxpayer-funded initiative designed to funnel money toward anybody who wanted to put it to use in citizen science or research.
I'm not sure what a program like that could look like in practice, though. You could put money directly into peoples' pockets through a sort of grant-like application process, ("I want to buy $250 worth of resistors, capacitors, chips, wire, and breadboards to make a new kind of fitbit prototype") but that would be an easy system to game, and a lot of the costs associated with experimenting with theoretical ideas comes from equipment anyways.
Because not everyone can afford or find room for a scanning electron microscope or a CNC lathe, right? Many cities have hackerspaces, but their capabilities often stop at hand tools and working with thermoplastics unless you're very lucky. So what about universities? They're present throughout the entire nation, often already publicly funded, and have access to functional research and manufacturing equipment.
Sadly, my experience trying to get even a few hours' use of any sort of university equipment, whether supervised, paid, through night classes, or otherwise, has been met with absolute stonewalls. If you don't pay full tuition, you can fuck right off.
So I'm not sure what the solution here is. What's an average person who wants to get into science supposed to do, besides be independently wealthy?
"I know that our budgets are already creaking under the load of our spending and entitlement programs [..]"
It worries me how that assumption is never challenged, even by people who "would absolutely love to see a taxpayer-funded initiative". The deficit-mania narrative is really hegemonic nowadays.
I was going to say something similar. We could do a lot of good just by shoveling more money to NSF and other established programs, like Obama did in the stimulus. It's a tiny fraction of the federal budget.
You can build a million roads and learn almost nothing. You can send a single probe to Venus and learn enough to fill entire books and launch a thousand careers.
I don't think we need to cut programs, I just think that our current programs aren't administered effectively. We spend trillions and it seems like there's just so much graft and waste.
> it seems like there's just so much graft and waste.
I hear this claim, but I've never seen evidence of it. How much graft and waste is there? In which department?
Every large organization has some of it - name one that doesn't. I have yet to see evidence of how relatively efficient or uncorrupt the U.S. government is. I do know that anti-corruption groups generally rate it highly relative to other governments worldwide, but many of those governments are in poor countries with high corruption (the latter being a major cause of the former).
Sure, but waste doesn't mean we're "creaking under their load," especially when it comes to entitlement programs. If we just wanted to double the amount we spend on science, energy, and the environment, we'd have some inefficiencies but make a small dent in the overall budget.
> our budgets are already creaking under the load of our spending and entitlement programs
Assuming you are talking about the U.S., AFAIK that is not true. The U.S. is the richest country in the history of the world, right now, and by a wide margin. Remember that under President Clinton, the U.S. government ran a surplus and the economy was booming. What changed?
* Large tax cuts for wealthy Americans, reducing revenue significantly
* Two long wars. I know Iraq cost over $1 trillion by itself.
* I think the prescription drug bill passed under Bush was very expensive
* The Great Recession, of course.
The U.S. is much richer now than when it ran that surplus. The main problem is not spending, but revenue. Republicans (not to be partisan on HN, but they are the ones doing it) keep cutting revenue (taxes) and then saying there is no money for anything.
We currently produce far more PhDs than "the system" can utilize, so I'd propose it would look like the PhD/postdoc track extended into retirement as a career, as opposed to the theoretically very narrow academic pyramid.
I wonder if admitting the system is defeated would result in fewer people being pushed into an already overcrowded system.
Essentially the new system would be a vow of poverty and childlessness until retirement, rather than the existing system pretending a high paying job or tenured slot is just around the corner.
You need a relationship with the university profs, such as being a perma-PhD, or else they're worried about liability.
I've found from a lifetime of screwing around that clear thought is necessary, along with an engineer's eye for compromise. Your groundbreaking neo-fitbit probably revolves around code and an ugly board full of parts, or at least it starts there, long before it needs a 3-D printer for little plastic parts.
You seem somewhat unaware of the realities of how science is done. What is "citizen science"? How is it different from normal science?
""I want to buy $250 worth of resistors, capacitors, chips, wire, and breadboards to make a new kind of fitbit prototype""
Did you read the article? I'm sure new fitbit prototypes are great and all, but... that is in no way blue-skies or fundamental research (which is what the article argues we need more of).
"What's an average person who wants to get into science supposed to do, besides be independently wealthy?"
Do what most people in science do: get a bachelor's degree and a PhD, then enter academia or industry, publish papers, read papers, attend conferences, and teach and mentor students. I don't think there are many scientists who fund themselves.
"Sadly, my experience trying to get even a few hours' use of any sort of university equipment, whether supervised, paid, through night classes, or otherwise, has been met with absolute stonewalls. If you don't pay full tuition, you can fuck right off."
What type of hardware are you trying to get access to? What are your goals? Is it to produce new scientific knowledge? Then you would probably be best served by going to grad school.
Edit: however, one does not have to attend grad school to do useful research, in exceptional circumstances. See Chris Olah at Google Brain.
Edit: one does not even have to be all that young. Look at Yitang Zhang.
>I know that our budgets are already creaking under the load of our spending and entitlement programs
You are welcome to cut my Social Security and Medicare "entitlements," but I've been paying into them for 20 years, so be prepared to write me a check for every penny taken from me. It's not an "entitlement" if I've been forced to pay into it for decades. I would be very careful with that word if I were you, as it's politically charged.
Also, the idea of "just throw more money" at a problem is tempting but often wrong. It won't bring some utopia; it'll just bring a lot of grant chasers going after bureaucrats who need to get rid of the money. Is there evidence the sciences are actually underfunded? By what measure? And which research specifically? Also, the "can't have 9 women make a baby in 1 month" truism applies: throwing more resources at something doesn't necessarily lead to the outcome you expect.
"Throw more money" at things shouldn't be an unquestionable truth. We did this with college loans and now college is unbelievably expensive and education outcomes questionable, especially since we saw the equivalent of 'grant chasers' in education in the form of diploma mills.
>You are welcome to cut my social security and medicate "entitlements" but I've been paying into them for 20 years so be prepared to write me a check for every penny taken from me. Its not an "entitlement" if I've been forced to pay into it for decades.
Yeah, that's not how it works. Your money is gone and isn't coming back, and the Supreme Court ruled that you have literally zero right to anything. ( https://en.wikipedia.org/wiki/Flemming_v._Nestor )
> "Throw more money" at things shouldn't be an unquestionable truth. We did this with college loans and now college is unbelievably expensive
I think that's not the cause. I've seen research showing that tuition increases are strongly correlated with cuts in government funding for higher education.
If there's some government program that hands out up to x amount of dollars for product y, then product y will quickly cost x amount of dollars. Education cuts have nothing to do with this.
If anything, colleges are rolling in money. If they aren't providing the resources their staff need, then that's something the staff needs to take up with the college. Blaming the government because these colleges just keep dumping money into their vast endowments is fairly ridiculous. Gutting my Medicare and Social Security to preserve endowments is asinine.
Those are interesting theories, but do you have any support for these statements? They don't match what I understand, the facts (taxpayer support for higher education has been cut widely), or the research I've seen.
EDIT: Regarding the links you added (thanks):
* The Slate article says, based on a study, it might be possible that student loan availability could be increasing the cost of college, but it's only a possibility. The article also says that among the biggest culprits are for-profit institutions and advocates blanket funding of higher education as a solution, a la Hillary Clinton and Bernie Sanders.
* The New York Fed link is to the study covered by the Slate article.
* The Forbes page doesn't want to load.
I don't see that as backing much of the parent comment. Also I'll note that a great proportion of funding for higher education is not via student loans, but taxpayer funding for research and for public universities and colleges.