The Era of Quantum Computing Is Here. Outlook: Cloudy (quantamagazine.org)
198 points by jonbaer on Jan 24, 2018 | 99 comments


> Note that I’ve not said — as it often is said — that a quantum computer has an advantage because the availability of superpositions hugely increases the number of states it can encode, relative to classical bits. Nor have I said that entanglement permits many calculations to be carried out in parallel.

I'm really glad to find they did not go for these usual pieces of pop-explanations, and even called them out as not really capturing the essence of quantum computing. It takes courage to not hide behind these inaccurate descriptions, and instead just give the unsatisfying truth that there's no simple layman level description beyond "quantum mechanics somehow creates a “resource” for computation that is unavailable to classical devices"; at least not with today's easily available mental models[1] we have from classical computing.

[1] http://lesswrong.com/lw/kg/expecting_short_inferential_dista...


I'm also glad they didn't fall back on the "easy" explanation. Funnily enough, an SMBC comic [0] is what made quantum computing "click" for me, and how it's not really easily mappable back to classical ideas. The comic was a joint effort between Scott Aaronson and SMBC creator Zach Weinersmith, and as far as I can tell it's pretty close to a faithful explanation of the basics without diving into the technical details.

[0] https://www.smbc-comics.com/comic/the-talk-3


If this comic isn't diving into technical details then I must be an idiot


I meant more that it's not trying to act like a whitepaper and "prove" the concepts it's explaining, which I think a lot of explanations of really complex topics end up spending way too much time on and lose the reader.

There are parts of the comic that I don't even begin to understand (I honestly couldn't tell you if Hilbert space is even a real thing, and I don't even want to begin to think about how or why amplitudes can be complex, and what you would possibly do with a complex probability...), but the whole idea of amplitudes being loosely analogous to probabilities, and how interference is the real "secret sauce" that allows quantum computers to gain a big advantage on some problems, is what finally made it click for me. And it got across the point that the real difficulty with quantum computing is going to be in isolating these "qubits" from the world (which made the originally linked article much easier to read).

I'd also attempted to read a bunch about it from many different sources before this, so I had a primer on some of the ideas behind it, but I couldn't understand how the parts fit together, or why they were such a big deal (often flip-flopping between the two feelings depending on the metaphors the last thing I read used).


> I don't even want to being to think about how or why amplitudes can be complex, and what you would possibly do with a complex probability

I don't pretend to completely understand it either, but one crucial point here is that these complex amplitudes and their interactions are the "real" reality - they are "more real" than the normal probabilities we are used to, in that they are the things that actually exist in the universe and give rise to the 'normal' probabilities that we see.

A lot of mental resistance to quantum mechanical ideas comes from thinking quantum mechanics is weird, instead of thinking quantum mechanics as normal and us being weird[1], because of the scale we live in and normally observe things in.

[1] paraphrased from Eliezer Yudkowsky, Harry Potter and the Methods of Rationality, Ch 36


To chime in on the probabilities issue: they aren’t really probabilities.

Probability amplitudes are something we don’t really understand, but it is true that once we have set a basis for measurement, we can treat the amplitudes with respect to that basis as probabilities (written in complex form).

However, this should not be interpreted as “quantum states are probabilistic mixtures of base states” - changing the measurement basis screws up that intuition. For example, if our new basis has one base vector that is equal to the quantum state we are measuring, suddenly our outcome is certain.
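A tiny NumPy sketch might help (it's just linear algebra, nothing quantum-specific): the same state gives 50/50 odds in one basis, and a certain outcome in a basis that happens to contain the state itself.

```python
import numpy as np

# The state (|0> + |1>)/sqrt(2).
psi = np.array([1, 1]) / np.sqrt(2)

# Computational basis {|0>, |1>}: squared amplitudes give ~50/50 odds.
p0, p1 = np.abs(psi) ** 2

# New basis {|+>, |->}, where |+> happens to equal psi itself.
plus = np.array([1, 1]) / np.sqrt(2)
minus = np.array([1, -1]) / np.sqrt(2)
p_plus = np.abs(np.vdot(plus, psi)) ** 2     # probability of outcome |+>
p_minus = np.abs(np.vdot(minus, psi)) ** 2   # probability of outcome |->

print(p0, p1)          # both ~0.5
print(p_plus, p_minus) # ~1.0 and ~0.0: the outcome is now certain
```

Same state, different basis, completely different statistics.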

Oh I wish I could explain this better.


I feel like that explanation almost gets me there, but understanding it is just out of reach.

Out of my many questions (for which I should probably seek books and videos), I'll ask this one:

> changing the measurement basis screws up that intuition. For example, if our new basis has one base vector that is equal to the quantum state we are measuring, suddenly our outcome is certain.

How does one change the measurement basis - is it a mathematical exercise done on paper, or is it an actual physical act?

Is this what is meant by collapse of a wave function?

Is this what we are doing when we 'measure' Schrodinger's cat at the end of the experiment and suddenly have a certain outcome?


Different measurement instruments have different bases. But we can also model our quantum system and imagine what would happen if we created a new instrument with any basis we choose.

Measurement is indeed wave function collapse - when a quantum state in a superposition (with regards to our measurement basis) is measured and produces a definite answer.


Wow, I don't think I've ever seen someone cite fanfiction on HN before! But considering HPMOR introduced me to Worm [0] and Unsong [1], I really can't argue with doing so. Also a relatively good way to introduce college friends to rationalism, if for some reason recommending fanfiction to friends is easier than recommending lesswrong.

[0] https://parahumans.wordpress.com/ [1] http://unsongbook.com/


Well, if you want a non-fanfiction citation, there's this from Eliezer's regular non-fiction blogging:

Reality has been around since long before you showed up. Don't go calling it nasty names like "bizarre" or "incredible". The universe was propagating complex amplitudes through configuration space for ten billion years before life ever emerged on Earth. Quantum physics is not "weird". You are weird. You have the absolutely bizarre idea that reality ought to consist of little billiard balls bopping around, when in fact reality is a perfectly normal cloud of complex amplitude in configuration space. This is your problem, not reality's, and you are the one who needs to change.

http://lesswrong.com/lw/hs/think_like_reality/

(Really, a lot of insights in HPMOR are just reiterations of points Eliezer made in his Sequences.)


Let's just say it's important to distinguish between what Yudkowsky considers to be quantum mechanical ideas, and what the equations and experiments actually say.


Yeah, the comic is probably quite a bit more accessible. Although I'm still left wondering if there is a proper ELI5 for quantum computing somewhere out there.


I really don't think there is, and I don't think there will be one ever.

If quantum computers ever become "mainstream", I think we will need to start teaching the basics behind them in schools, because there isn't an easy way to map the ideas there to classical ones.

Think of it like trying to explain how a CPU works to a person who doesn't know basic math. There is no way to even "ELI5" that; there just isn't enough of a foundation to work from, and any attempt to do so would take so many liberties that it wouldn't even really be useful. Right now the basics of quantum computing seem extremely difficult to understand because 99.99%+ of us are missing those foundations.


But if you're explaining it to a 5 year old, they've hardly been poisoned by the classical ideology taught in schools. You couldn't analogize it to a classical concept anyway because they wouldn't have the background. So maybe there would be an ELI5 that would be effective, if it was intended to be understood by someone without the baggage of traditionalism. :)


Sure, but the true ELI5 would take years to get through the necessary fundamentals.


And end with a diploma for the now young adult.


The sentence in the last box made me chuckle. It reads like a parody of something Deepak Chopra would write.


With SMBC there is always a "hidden" last panel too, you can see it by clicking the big red button at the bottom right of the comic.

Although this one won't make any sense unless you read a lot of these kinds of webcomics!


The general idea I've had is that a quantum computer will be much more similar to a classical analog computer than to a digital one. It requires setting up a quantum system which models the problem physically, and relies on quantum properties (interference, I suppose) to converge on the correct solution without doing any sort of brute-forcing. Thanks for the comic link, really helpful.


Thanks a lot for that link. That comic was a ton of fun (as one would expect from SMBC).


LOL. From your link :

The Kid : Wait, you guys put complex numbers in your ontologies?

The Mom: We do and we enjoy it.

The Kid : Ewww


I don't know much about quantum computation, but David Deutsch does say something along those lines. In particular, that the superiority of quantum computation for certain problems is explained by computation taking place in multiple universes.

He also rejects the idea that there's something ineffable about quantum mechanics, or that the reason quantum computation works can't be explained using ordinary logic and reasoning that is accessible in principle to everybody (I don't know for sure that's what is meant here by "somehow creates a “resource”": but I'm triggered a bit by those words because people often do essentially mean that).

Of course he would (correctly) say that one shouldn't treat him as being an infallible source of truth on the matter. But he does believe that the purpose of science is to explain the world, which puts him in a different class from apologists for quantum mysticism, strict empiricism and other believers in explanation-free knowledge.


Rigetti Quantum Computing [0] is a YC startup that actually manufactures real quantum computers that are accessible through Forest.

There are a couple neat videos about the tech. [1,2] You can access the service through a Python API [3].

Disclaimer: I work there.

[0] https://www.rigetti.com

[1] “Lisp at the Frontier of Computation” https://youtu.be/f9vRcSAneiw

[2] Rigetti talk at QIP 2017 https://youtu.be/IpoASc18P5Q

[3] https://github.com/rigetticomputing/pyquil


I asked some people a few years back whether some equivalent of Moore's law applied to qubit counts, and what it meant for cryptography. Their answer was that there wasn't good evidence of a direct link between the qubit density we have now and achievable breaks in RSA, but also that the fundamental metric was probably wrong right now. Yes, how many qubits is interesting. But so are how stable they are, how many states you can demonstrate, and which specific problems you target among the open questions QC can address.


My area of research is primarily post-quantum cryptography, not strict quantum mechanics. That said, based on my understanding of the implementation details of quantum computers (as opposed to e.g. quantum complexity), I’m pessimistic about quantum computers being productively used against real world RSA in the next 20 years. I’d conservatively estimate that it could happen in 50 with a series of breakthroughs.

I think quantum computers will be useful for some applications (maybe even many) and commercially available within the next decade or two, I just don’t think cryptographic breaks will be possible on them until they advance well beyond whatever is first brought to market. You need logical qubits for polynomial-time cryptanalysis of RSA, and many of them. To get these logical qubits, you need so many more physical qubits that it’s not really productive to map our current records (~50 qubits) to what we’d need. In fact, I predict we’ll get to quantum cryptanalysis via a paradigm shift that approaches the problem in a fundamentally different way long before we master the requisite error correction to have enough physical and logical qubits necessary under the current paradigm.


Are quantum computers useful for cryptanalysis in the case where you don't quite have the necessary number of qubits? Like let's say you want to factor an integer N, which would require n qubits, but you have a QC with only n/2 qubits. Could you use the QC to narrow down the search space and then finish with a classical computer, or is the QC completely useless in this case?


The article mentions this problem, and there seems to be an emerging concept called "quantum volume": basically, how complex an algorithm you can run on a quantum computer, taking into account the number of qubits, their error rate, their stability, and so on.


I hope that we see some analysis done on this subject in the near future, because the rising uncertainty regarding current public-key tech is starting to get difficult for certain businesses.


This is a very active area of research in theoretical cryptography, and has been for over a decade. It’s an area that I myself work in. The uncertainty is not quite what you think: researchers more or less know what will happen to public-key cryptography, and to which cryptosystems. We also have credible proposals for encryption, key exchange and digital signatures. The uncertainty is when quantum computers will be practical, not what will happen when they are.

Contemporary research is predominantly concerned with improving computational efficiency or security proofs for the underlying complexity assumptions.


> The uncertainty is when quantum computers will be practical, not what will happen when they are.

That's what I meant above; I apologize for I should have been more clear in my wording.


Ah, no worries. Post-quantum cryptography has a reputation for being somewhat excessively theoretical (which to be fair is a reasonable argument). But there are benefits other than post-quantum security. For example, lattice problems can provide security according to worst-case hardness, whereas factoring problems can only provide average-case hardness.

In other words, for some lattice problems, we can prove that breaking the cryptosystem is equivalent to breaking every instance of the lattice problem, but factoring problems only guarantee that breaking the cryptosystem is equivalent to breaking a subset of problems from some distribution. The former is a much stronger security assumption (though in practice this comes with its own set of challenges).

It’s an exciting area of research.


There is a public-key cryptosystem based on error-correcting codes that is believed to be quantum-safe: the McEliece cryptosystem. It's about as old as RSA. However, the key sizes are in the tens of thousands to hundreds of thousands of bits.

From my reading, most cryptographers think it is secure. So if you need it now, you could probably get away with using it. However, one should have a decent explanation for why one is using a less well-known algorithm. It has probably had fewer eyes looking at it, so a major flaw or attack could exist, but it's not known whether one does.


McEliece has had significant cryptanalytic attention since the 70s, when it was first invented. In fact, it’s probably the most well-studied post-quantum proposal, and it’s not really waiting on any further cryptanalysis. It’s currently considered secure (with binary Goppa codes) and has modes for both encryption and digital signatures via Niederreiter, but it doesn’t support key exchange.

The real issue, as you touched on, is key size. There are production deployments of the cryptosystem, but it’s actually a bit worse than your numbers if you’re looking for post-quantum resistance. In the post-quantum secure setting, McEliece uses public keys that have an upper bound of over 1MB (around 8.5 million bits, specifically) in order to achieve 128-bit security.

Fortunately, McEliece is not the only proposal we have, and error-correcting codes aren’t the only computational problem being studied. We also have credible proposals from lattices, hashes, multivariate polynomial equations and (most recently) supersingular elliptic curve isogenies.


From my research so far it looks like SIDH is the most promising candidate to upgrade existing protocols (relatively congruent to ECDH; "small" key sizes).

For proprietary protocols the "newness" of pq-crypto doesn't seem to be a big problem per se; just use curve hardening (kdf(ECDH || pq-kex)) in case the pq-crypto is broken (either due to implementation defects or cryptanalysis).
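As a sketch of what that curve-hardening construction looks like (the two secrets are placeholders for real X25519 and PQ-KEM outputs, and this kdf is a hand-rolled HKDF-style illustration, not any particular library's API):

```python
import hashlib
import hmac

# Placeholders, NOT real key material: in practice these come from an
# actual ECDH exchange and an actual post-quantum key exchange.
ecdh_secret = b"\x01" * 32
pq_secret = b"\x02" * 32

def kdf(ikm: bytes, info: bytes = b"hybrid-kex", length: int = 32) -> bytes:
    # HKDF-style extract-then-expand (after RFC 5869) over the
    # concatenated secrets; an attacker must break BOTH exchanges
    # to recover the session key.
    prk = hmac.new(b"\x00" * 32, ikm, hashlib.sha256).digest()
    return hmac.new(prk, info + b"\x01", hashlib.sha256).digest()[:length]

# kdf(ECDH || pq-kex), as described above.
session_key = kdf(ecdh_secret + pq_secret)
```

If the pq scheme later turns out to be broken, the session key is still at least as strong as plain ECDH.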


It depends on the application. SIDH is attractive because it doesn’t throw out decades of elliptic curve mathematics because of the spectre of quantum computers. The mathematics is a bit more familiar if you squint at it, though working with isogenies themselves is still very different from normal elliptic curve cryptography. But more importantly, a codebase that implements elliptic curve arithmetic can port much of that functionality to implement SIDH.

And yes - SIDH is also attractive because of the small key sizes. But it’s also significantly slower than lattice-based key exchange using something like Learning With Errors (LWE). In practice the decision comes down to time versus space constraints: if your application is space-poor and time-rich, SIDH is a good proposal for key exchange. This looks especially nice in the context of IoT devices. But if your application is space-rich and time-poor, SIDH looks less attractive in favor of other options.


I am out of the loop, are there any "real" quantum computers yet? I was under the impression that no one was really making actual quantum computers yet, but some sort of annealing machine like d-wave. Are these IBM machines finally "real" quantum computers?


D-Wave is not a "real" quantum computer in the sense that it is not asymptotically faster than a classical computer. Quantum computers with very few qubits have existed for a while -- I believe the current record is 17 qubits [1]. I believe that to factor an n-bit integer, you need roughly n qubits. So right now, we can factor 17-bit integers, which isn't very useful yet. However, I have read that for quantum simulations, 30-qubit quantum computers might already be useful.

[1] https://en.wikipedia.org/wiki/Timeline_of_quantum_computing


It is worse than that. To factor an n-bit number you need roughly n "logical" qubits, i.e. qubits that can store information almost indefinitely. The quantum computers that we have contain low-quality "physical" qubits that lose their content after a few milliseconds. You need to put an error-correcting code on top of the physical qubits to encode the logical qubits. You can expect to need 100s or 1000s of physical qubits per logical qubit, i.e. to break modern public-key encryption you need a computer with millions of physical qubits. Today we have fewer than 50 in the best research hardware on the planet.
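To make the arithmetic concrete, here is a back-of-envelope sketch; both the ~2n logical-qubit count for Shor's algorithm and the 1,000x error-correction overhead are rough illustrative assumptions, not settled figures:

```python
# Rough estimate of physical qubits needed to factor an RSA modulus.
n_bits = 2048            # RSA-2048 modulus size
logical = 2 * n_bits     # assumed logical qubits for Shor's algorithm (~2n)
overhead = 1000          # assumed physical qubits per logical qubit
physical = logical * overhead
print(physical)          # 4096000 -> millions of physical qubits
```

Compare that to the ~50 physical qubits of current record hardware and the gap is obvious.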


This one I don't get. Isn't "a few milliseconds" more than enough time to gather a guess on integer factorization (and most of the useful problems)? Wouldn't it suffice to run your program again and again over the same short-lived qubits?


Or a good cavity or two.


Yes. Volkswagen has started using Google's 20-qubit quantum computer (or possibly even the 49-qubit one, which will be announced this March):

https://media.vw.com/releases/951

IBM is also giving select customers access to its real 17-qubit quantum computer.


What does a car manufacturer do with a quantum computer?


Quantum computers can be used for some optimization problems.


There are no commercially useful quantum computers yet. Asking whether there are "real" quantum computers is like asking whether 6 noisy transistors wired together in a lab in the 1940s were a real (classical) computer.


For a great perspective read this:

https://arxiv.org/abs/1801.00862

This is a write-up of his keynote talk at Q2B this year.


There are a lot of proposals for making logical qubits. One of the more favorable is a two dimensional surface code [0]. Current experiments suggest that it is possible to scale superconducting processors to around 100 physical qubits [1] of relatively high quality. It should be possible to perform significant error correction experiments on them.

As the article points out, noisy quantum computers won't be useless. They could be used to speed up some optimization and quantum chemistry tasks.

[0]: https://en.wikipedia.org/wiki/Toric_code

[1]: https://www.nature.com/articles/s41534-016-0004-0


Microsoft's Q# is interesting. It's framed in a way that seems weirdly premature, but it is probably the nicest platform to learn about and play with quantum algorithms. Better than multiplying huge matrices in Octave, for sure.


It's definitely the most developer-friendly language, and it borrows heavily from their functional F# language. Microsoft has placed a heavy bet on simulation so that they have a bit more runway before they need to produce a working quantum computer based on a topological architecture. Interesting that they are targeting developers, whereas most other languages are targeting scientists.


It's written in F# so it makes sense it draws inspiration from it.


That’s outstanding. Have you solved anything with it?


Only extremely basic stuff; nearly done with Deutsch-Jozsa now.

But again, the alternatives are basically just representing your qubits as huge vectors and gates as huge matrices and doing lots of multiplication, which is an extremely cumbersome and frustrating way to program and precludes any kind of abstraction, even for toy learning problems.
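For the curious, the "huge vectors and matrices" approach looks something like this in NumPy (a toy two-qubit example that prepares a Bell state; even here the Kronecker products start to get tedious):

```python
import numpy as np

# Gates as matrices, states as vectors; basis order is |00>, |01>, |10>, |11>.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],                 # control = first qubit
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.zeros(4)
state[0] = 1.0                                 # start in |00>
state = np.kron(H, I) @ state                  # Hadamard on the first qubit
state = CNOT @ state                           # entangle -> Bell state

print(np.round(state, 3))                      # ~[0.707 0 0 0.707]
```

State vectors double in size with every qubit, which is why this gets unusable fast and why a dedicated language helps.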


The article, as carefully as it seems to be written compared to many pop sci articles on the topic, contains many errors or mischaracterizations. For instance:

Some researchers think that the problem of error correction will prove intractable and will prevent quantum computers from achieving the grand goals predicted for them.

The author then provides a quote intended to justify this claim, but the quoted mathematician merely points out the difficulty of error correction (by--correctly--pointing out that quantum error correction is more difficult than merely proving quantum supremacy, i.e. producing a quantum result that cannot accurately be replicated using a classical computer), and doesn't say the problem of error correction will prove intractable.

The author seems to be painting a picture of quantum computing that is less certain than is warranted. A counter-point to the popular hype of quantum computing is necessary, but it shouldn't include such inaccuracies.


Forgive me if I've missed it, but I haven't seen any evidence that quantum computing has been used to solve even a very simple calculation. It seems that the motivation behind trying to build a quantum computer is that you really need to be out in front, not necessarily that it is possible.

Are quantum computers just the fusion reactors of the computing world? Maybe one won't be built for 20 or 30 years, maybe longer.


> Are quantum computers just the fusion reactors of the computing world?

This is not the proper take-away message from the history of fusion. The expert consensus in 1976 predicted that useful fusion would not be achieved if it followed the funding trajectory it actually did end up taking.

https://news.ycombinator.com/item?id=8307584#8311566


> Forgive me if I've missed it but I haven't seen any evidence that quantum computing has been used to solve even a very simple calculation.

Factoring small numbers is a "very simple calculation," and it has been done multiple times with quantum computers: https://phys.org/news/2014-11-largest-factored-quantum-devic...


Google has been doing some interesting stuff already, like representing models of the hydrogen H2 molecule:

> While the energies of molecular hydrogen can be computed classically (albeit inefficiently), as one scales up quantum hardware it becomes possible to simulate even larger chemical systems, including classically intractable ones. For instance, with only about a hundred reliable quantum bits one could model the process by which bacteria produce fertilizer at room temperature. Elucidating this mechanism is a famous open problem in chemistry because the way humans produce fertilizer is extremely inefficient and consumes 1-2% of the world's energy annually. Such calculations could also assist with breakthroughs in fundamental science, for instance, in the understanding of high temperature superconductivity.

https://research.googleblog.com/2016/07/towards-exact-quantu...


"The largest known number factored by Shor's algorithm on a quantum computer is 21."


At this point, yes. We need ~70 years of research to get to something useful. (I asked the professor I studied quantum computing with about this.)


This is not a typical view among experts. Most numbers quoted are between 10 and 30 years for commercially useful devices.


Those quotes have been around for the last 10 to 30 years, however. So 70 is a very sane number.


I work in the field and I will have to disagree - 20 years ago nobody serious would have claimed we could build a quantum computer in 10 years. There are very clear levels we need to reach to have something useful (in terms of the lifetime of the qubits and the number of connected qubits), and there has been an extremely clear upward slope on both of those measures over the last 15 years. Extrapolating optimistically from them, in 5 years we will have quantum chemistry simulators. Extrapolating pessimistically, it will take 20 or so years.


Not quite true. I can think of two well funded people in 1995-2000 who either believed (or implied to funders) that scalable quantum computing was plausible within 10 years. But those people were outliers. You're correct that it was an uncommon view, and mainstream consensus in the field was that it was many decades away. The situation today is certainly very different, and people seem much more optimistic.


This is simply not true.

I'm not talking about the pop science articles you read who find the most outlandish quote they can find. (There are plenty of those now that say 1-3 years.) I'm talking about what the actual typical practitioner says.


Relevant xkcd:

https://xkcd.com/678/


I honestly cannot remember the last time a new technology hasn't significantly overshot ETAs


That's because with new developments that do get achieved in predicted timelines, the predictions don't get mentioned, because they're only unsurprising, ordinary news. Whereas with tantalizing but out-of-reach technologies, we get repeated "are we there yet" articles, and skepticism makes for more popular journalism. It's a combination of selection bias and the availability heuristic.


Really? How about mapping the human genome?


The discussion reignited my desire to learn quantum mechanics on the side. Usually the Griffiths book is recommended; is this still the best intro to quantum mechanics?


I used both Griffiths’ and Townsend’s books in undergrad QM classes; they complement each other nicely. It’s been a while, but IIRC Griffiths starts off with a focus on continuous wave functions while Townsend starts off with spin and discrete states.

If you’re interested in quantum computing, spin is the most applicable place to start, as a qubit can be identically represented as a spin-1/2 particle.

I wouldn’t recommend Sakurai (which someone else mentioned) if you’re just starting. It’s what I used in grad school, and doable depending on your math background and motivation, but not the easiest intro.
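For reference, the identification works because any pure qubit state can be written in exactly the same form as a spin-1/2 state on the Bloch sphere:

```latex
% theta is the polar angle, phi the azimuthal angle on the Bloch sphere;
% global phase is physically irrelevant, so two real parameters suffice.
|\psi\rangle = \cos(\theta/2)\,|0\rangle + e^{i\varphi}\sin(\theta/2)\,|1\rangle
```

Anything you learn about spin-1/2 measurements carries over to qubits verbatim.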


Certainly a matter of taste. I'd recommend "Modern Quantum Mechanics" from Sakurai.


I wonder why we should expect practical quantum computing to be here any sooner than practical nuclear fusion?


what will happen to RSA encryption once they are here?


Gone, but there are quantum safe algorithms, we just need to migrate to them as industry.

Is it called post-quantum cryptography.

https://en.wikipedia.org/wiki/Post-quantum_cryptography


Qubits are a bad idea. You want continuous variable quantum computing. https://en.m.wikipedia.org/wiki/Continuous-variable_quantum_...


The wiki article says twice that the motivation for studying continuous-variable quantum computing is to understand "the ways in which quantum computers can be more capable or powerful than classical ones". But it never explains why you would want to use continuous-variable quantum computing over qubits. What are your arguments for why the correct path forward is not qubits?


Why do the real numbers solve more rational equations than the integers? They’re both infinite (though not equinumerous), but the integers can only come close to the reals.


And this line of reasoning completely fails when you start considering error correction. This is why we need bits and qubits. Aaronson's lecture notes go into the mathematical details if you are interested.


But now you have to demonstrate that your analogy holds.


This is a pretty common misconception (that qubits are a bad idea). Qubits or qudits are necessary to achieve scalable error correction. I work at one of the research institutes that pushes the use of continuous variable systems (they are great). But we implement qubits on top of them (that is the whole point).


What makes qubits a bad idea? What problems of qubits does continuous variable quantum computing solve?

I’m not being Socratic, I just don’t know.


My guess is the quantum computer that Feynman theorized about was continuous. I’d bet it needs to be continuous to solve all the NP problems and so on.


No variant of quantum computing is expected to solve NP problems in polynomial time. That's just a common misconception.



In principle. Not in the actual universe we inhabit, with its Bekenstein bound.


Arbitrary precision (not infinite) is good enough for me.


Sort of like coming arbitrarily close to the speed of light, in that it will require energy on the same asymptotic scale you’re hoping to achieve precision in. Degenerate matter computers?


"Real computation," which allows you to solve NP and #P problems in polynomial time, requires infinite precision. With just arbitrary precision it isn't any more powerful than classical computation.


"Real computation" is a flawed model because it does not permit scalable error correction mechanisms. Check out Aaronson's lecture notes if you are interested in the rigorous argument.


I disagree with Aaronson but am not quite there to have a serious discussion about it.


If you think real computation is in any way feasible why don't you build one and make a silly amount of money?


Those seem more similar to old school analog computers, in a sense.


The interesting thing is that a perfect analog classical computer is more powerful than the qubit quantum computer model (which is what people mean by quantum computing). But analog computers (whether classical or quantum) do not permit error correction, and as such are useless for anything but the smallest problem. You need bits and qubits in the real world.


They very much are.


They’re equivalent. A continuum can be encoded into qubits and vis versa.


They definitely aren’t.


Consider an array of qubits forming a register. Now, a cavity resonator's state can be expanded in a photon-number basis. If you truncate the cavity resonator to 2^n photons, then map the 0-photon ket to the 0 state of the register, the 1-photon ket to the 1 state of the register, and so on, all the way up to mapping the (2^n - 1)-photon ket to the (2^n - 1) state of the register. Now you've mapped the cavity state onto the register, and they're informationally equivalent.
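In code, the mapping is just a relabeling of basis vectors (a toy sketch; the register size n and the example 5-photon ket are arbitrary choices for illustration):

```python
import numpy as np

# Truncate a cavity mode to 2**n photon-number states and identify the
# k-photon ket |k> with the n-qubit register's computational-basis state |k>.
n = 3                                # qubits in the register
dim = 2 ** n                         # keep photon numbers 0..7
cavity_state = np.zeros(dim)
cavity_state[5] = 1.0                # the 5-photon ket |5>

# The same vector, read as the register state |101> (5 in binary).
register_label = format(int(np.argmax(cavity_state)), f"0{n}b")
print(register_label)                # 101
```

Superpositions carry over the same way: the amplitude on photon number k becomes the amplitude on register state |k>.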


Sure, but analog computing has the advantage that you can solve optimization problems natively, in hardware. With qubits, you get a speedup but not O(1) solutions.


Absolutely not. The time complexity is exactly the same. A quantum computer can simulate quantum annealing with exactly the same asymptotic complexity as an “analog” physical system that takes advantage of quantum mechanics would.


I disagree but ok.



