Hacker News | janis1234's comments

I found reading the Linux source more useful than learning about xv6 because I run Linux, and reading through the source felt immediately useful, e.g., tracing exactly how a real process I work with every day gets created.

Can you explain the significance of O(n^2) vs O(n) better?




I still don't quite get your insight. Maybe it would help me better if you could explain it while talking like a pirate?


It's weird: while the second comment felt like slop to me due to the reasoning pattern it expressed (I'm not really sure how to describe it; it's like how an automaton that doesn't think might attempt to model a person thinking), skimming the account I don't immediately get the same vibe from the other comments.

Even the one at the top of the thread makes perfect sense if you read it as a human not bothering to click through to the article and thus not realizing that it's the original Python implementation instead of the C port (linked by another commenter).

Perhaps I'm finally starting to fail as a Turing test proctor.


> Each step is O(n) instead of recomputing everything, and total work across all steps drops to O(n^2)

In terms of computation, isn't each step O(1) in the cached case, with the entire thing being O(n)? As opposed to the previous O(n) per step and O(n^2) total.
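A toy sketch of the distinction being argued over (this is not the article's code; running sums stand in for whatever per-step quantity is being cached):

```python
def running_sums_naive(xs):
    # Each step recomputes the sum from scratch: O(n) per step,
    # O(n^2) total across all steps.
    return [sum(xs[:i + 1]) for i in range(len(xs))]

def running_sums_cached(xs):
    # Each step reuses the previous step's result: O(1) per step,
    # O(n) total across all steps.
    out, acc = [], 0
    for x in xs:
        acc += x
        out.append(acc)
    return out

# Both produce the same answers; only the amount of work differs.
assert running_sums_naive([1, 2, 3, 4]) == running_sums_cached([1, 2, 3, 4])
```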


But the code was written in Python not C?

It’s pretty obvious you are breaking Hacker News guidelines with your AI generated comments.


Have you considered an eBPF filter that looks for 'Mozilla/5.0 (compatible; crawler)' and drops packets from that IP for 1 hour? I.e., this is probably the best way to handle bots: don't even reply, so they have to wait for a timeout, which is usually a few seconds.
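A userspace sketch of the ban logic such a filter would implement (the UA string and 1-hour window come from the comment; a real version would live in eBPF/XDP, not Python, and `should_drop` and the dict-based ban list are illustrative names):

```python
import time

BAN_SECONDS = 3600  # the 1-hour ban window from the comment
BAD_UA = b"Mozilla/5.0 (compatible; crawler)"
banned = {}  # ip -> ban expiry timestamp

def should_drop(ip, payload, now=None):
    now = time.time() if now is None else now
    if ip in banned and banned[ip] > now:
        return True  # still inside the ban window: drop silently
    if BAD_UA in payload:
        banned[ip] = now + BAN_SECONDS  # start a fresh 1-hour ban
        return True  # drop, don't reply, let the bot time out
    return False

# Usage: once an IP sends the bad UA, everything from it is dropped
# for the next hour, including requests without the UA string.
should_drop("10.0.0.1", b"User-Agent: " + BAD_UA, now=0.0)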


Good, I'm sick of all the cheap Chinese products that break in a week so I have to order another. Even the clothing seems to fade and become useless in 6 months.


You could always buy more expensive clothing; now you'll be forced to.


From my understanding, RISC-V chips are slower, more expensive, and have less optimized compilers, so why in the world would an end user use one?


No? Performance is implementation specific, they’re usually cheaper than ARM since there’s no core ISA license overhead, and while the core instruction set being extremely limited does cause a little bit of tension in compiler land, most cores with a baseline set of extensions get reasonable code generation these days.

One of the main reasons RISC-V is gaining popularity is that companies can implement their own cores (or buy cheaper IP cores than from ARM) and take advantage of existing optimizing compilers. Espressif are actually a perfect example; the core they used before (Xtensa) was esoteric and poorly supported and switching to RISC-V gives them better toolchain support right out of the gate.


You are really only correct in your last point, as the advantage of RISC-V goes to the company implementing its own core, not to the end user.

The reason is that the CPU cores form only a tiny part of the SoC; the rest of the SoC is proprietary, likely documented only to whatever level the company needs, with anything else hidden under layers of NDAs. Just because the ISA is open does not mean you know anything about the rest of the chip.

That said, the C5 is a nice SoC, and it is nice that we have some competition to ARM.


But where do the original Xtensa cores place then?


Can you explain what the significance of the US 10-year yield going up is?


It will affect anything that comes due in 10 years. So say you want to take out a 20-year mortgage: the payments that happen 10 or more years out will have their rate influenced by the 10-year yield. Or say you're trying to sell a business. When buyers look at the future earnings of that business, a higher rate on the 10-year means they give you less for it, because money in 10 years' time is worth less today.

Also, if you're going to retire in 10+ years, sucks to be you: your 401k just became a lot less valuable, and the income you can expect in retirement will be lower.

Now because nothing happens in a vacuum, the 10yr spiking will affect other yields too.
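The discounting effect described above can be sketched with made-up numbers (the 3% and 5% rates are hypothetical, not quotes):

```python
def present_value(cash, rate, years):
    # Standard discounting: what a future cash flow is worth today.
    return cash / (1 + rate) ** years

# $100 received in 10 years, discounted at two different 10-year yields:
pv_low_rate = present_value(100, 0.03, 10)   # roughly $74
pv_high_rate = present_value(100, 0.05, 10)  # roughly $61

# A higher 10-year yield makes the same future money worth less today,
# which is exactly why the business fetches a lower price.
assert pv_high_rate < pv_low_rate
```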


It means that they are not being bought. Normally, if stocks go down, treasury yields go down, because investors use treasuries as safe havens for their money, so there's higher demand for them and hence the yield falls.

So why isn't this happening now? There are probably a bunch of reasons. There seems to be a liquidity problem in the markets, which is odd, because stocks are being sold, so there should be liquidity; but it seems people would rather hold on to their cash than invest in anything. Meanwhile, treasuries are still being sold like crazy. For a while even gold went down, which is also highly unusual during a market crisis. One reason frequently cited now, and also in this article, is that some hedge funds have specialized in the so-called "basis trade", which I had never heard of before. It basically means buying large amounts of treasuries and speculating on their futures. In itself this does not bring in much money, but it is done on a huge scale, and it's currently going very wrong: they are getting margin-called and need to sell assets.

At some point the Fed would normally intervene and buy treasuries. However, the Fed still has a ton of treasuries from the COVID crisis on its balance sheet, and I think it will not intervene until the market has gone quite a bit lower, so I'm still positioned very bearish. I lost quite a bit from that when Trump announced the delay, but two days later we're already close to being back where we started, and I could kick myself for even reacting. You can't win in this game when it's so distorted, so I'm mostly staying out now and just leaving my few remaining PUTs open; I guess many do the same. The US economy is now fully dependent on the whims of a corrupt president, whose friends are surely making a ton of money on insider knowledge, so there's zero trust in anything anymore.
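The mechanical link between "treasuries being sold" and "yields going up" can be shown with a toy zero-coupon bond (the prices here are made up for illustration):

```python
def implied_yield(price, face=100, years=10):
    # Annualized yield of a zero-coupon bond bought at `price`
    # and redeemed at `face` after `years` years.
    return (face / price) ** (1 / years) - 1

y_before = implied_yield(70)  # hypothetical price before the sell-off
y_after = implied_yield(65)   # lower price after heavy selling

# When the price falls, the yield mechanically rises: selling pressure
# on treasuries *is* rising yields, they are the same event.
assert y_after > y_before
```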


How can you be bullish after all of that? Tariffs are still on for China, and the administration has been caught manipulating the market. You're either in cash or holding the bag while they tank the market.


Sorry, I meant bearish, I edited my article... :-)


>It seems there's a liquidity problem in the markets, which is also weird because stocks are being sold, so there should be liquidity.

This is a normal relationship. The financial system runs on leverage. When liquidity dries up and sellers dominate, volatility rises, which feeds back into value-at-risk metrics and realized portfolio volatility. That causes participants to downsize their exposure to keep the same risk. In these cases participants don't necessarily have extra liquidity to deploy until volatility and uncertainty settle.

>For a while, even gold went down, which is also highly unusual during a market crisis.

During a true crisis this is actually not that uncommon. In the trading space, you sell what you can during these events. In extreme circumstances where correlations all go to one, even the safe havens get sold off to raise cash. The gold liquidation was a sign of pressured margin metrics and very high stress. You can see how quickly gold recovered after the peak stress, confirming that reading.

>One reason that is frequently cited now and also in this article is because some hedge funds have specialized on the so called "basis trade", which I never heard of before. It basically means buying large amount of treasuries and speculating on their futures.

The basis trade is more of an arbitrage than a speculation. The global eurodollar system (not the currency; offshore dollars) is about 20 trillion dollars in size. The basis trade is essentially the mechanism by which regulated US financial markets are exported to the world at large for use in the global, emergent financial system. Instead of buying a Treasury directly, one can use the futures/eurodollar market, and basis traders will arbitrage the difference and keep everything in line.
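A stylized sketch of the arbitrage (the prices are invented, not market data; real basis trades also account for carry and delivery options):

```python
# The "basis" is the small gap between a Treasury's cash price and the
# price implied by its futures contract, per $100 of face value.
cash_price = 99.50       # hypothetical price of the deliverable Treasury
futures_implied = 99.58  # hypothetical price implied by the futures leg

basis = futures_implied - cash_price  # the arbitrage edge

# The trader buys the cheap leg and sells the rich one (here: buy cash,
# short futures). The edge is only a few cents per $100 face, which is
# why the trade is run at enormous leverage, and why margin calls on it
# can force large, sudden Treasury sales like the ones described above.
assert 0 < basis < 0.10
```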

>At some point, the FED would normally intervene and buy treasuries.

The issue the Fed is facing isn't so much its balance-sheet capacity as the tariff-driven inflation outlook and how it interacts with its mandate. That said, they did clearly communicate today that they would be willing to step in at severe signs of systemic instability. The reality is, if tariffs really do cause an inflation wave, yields at 5% may turn out to be relatively low...


Thank you for these insights, I appreciate it.


I heard a rumor that China has been dumping some of their dollar-denominated assets as a sort of "bad dog" reaction to the current state of affairs.


It means it's harder for the US government to sell bonds.


The book is 10 years old; isn't it outdated?


Even Russell and Norvig is still applicable for the fundamentals, and with the rise of agentic efforts it would be extremely helpful.

The updates to even the Bias/Variance Dilemma (Geman 1992) are minor if you look at the original paper:

https://www.dam.brown.edu/people/documents/bias-variance.pdf

They were dealing with small datasets or infinite datasets, and double descent only really works when the patterns in your test set are similar enough to those in your training set.

While you do need to be mindful about some of the older opinions, the fundamentals are the same.

For fine-tuning or RL, you hit the same problems with small or effectively infinite datasets, where concept classes in the training data may be novel; that 1992 paper still applies, and it will bite you if you assume it no longer does.

Most of the foundational concepts are from the mid 20th century.

The availability of mass amounts of data and new discoveries have modified the assumptions and tooling way more than invalidating previous research. Skim that paper and you will see they simply dismissed the mass data and compute we have today as impractical at the time.

Find the book that works best for you, learn the concepts and build tacit experience.

Lots of efforts are trying to incorporate symbolic and other methods too.

IMHO Building breadth and depth is what will save time and help you find opportunities, knowledge of the fundamentals is critical for that.


Have not read the book, but only deep learning has advanced so wildly that a decade would change anything. The fundamentals of ML training/testing, variance/bias, etc. are the same. The classical algorithms still have their place. The only modern advancement that might not be covered is XGBoost-style gradient-boosted trees.


Machine learning concepts have been around forever; they just used to call them statistics ;)


Nope, and AIMA/PRML/ESL are still king!

Apart from these 3 you literally need nothing else for the very fundamentals and even advanced topics.


This is one of the most acronym heavy discussions I've ever seen. I searched "AIMA/PRML/ESL" to find the books, and the first result is a Reddit thread with most upvoted comment "Can we use the names of the books instead of all acronyms, not everyone knows them lol".


You're right.

AIMA is Artificial Intelligence: A Modern Approach by Stuart Russell and Peter Norvig.

PRML is Pattern Recognition and Machine Learning by Christopher Bishop.

ESL is Elements of Statistical Learning by Trevor Hastie, Robert Tibshirani and Jerome Friedman.


Depends on what your goal is. If you’re just curious about ML, probably none of the info will be wrong. But it’s also really not engaging with the most interesting problems engineers are tackling today, unlike an 11 year old chemistry book for example (I think). So as interview material or to break into the field it’s not going to be the most useful.


I have read parts of it. It arguably was already "outdated" back then, as it mostly focused on abstract mathematical theory of questionable value instead of cutting edge "deep learning".


Any recommendations?


Wonder what percentage of illegal immigrants are criminals (other than being here illegally). My guess is 10 percent (no idea what Trump thinks, probably 80 percent), which is why most people in California consider illegal immigration acceptable.


Interoperability with a subpar user experience is just an excuse for poor engineering or low resources. E.g., my x-wifi-network card doesn't work in Linux: no one is spending time making it work, or there are too many devices to test properly. It is the manufacturer's responsibility to make it work with Linux, and they don't care, so a few people make it work by writing generic drivers that may or may not be optimized for the specific manufacturer. Same story for every case of "interoperability with a subpar user experience".


Unfortunately many of these scientists stopped doing science and became activists.


$1 for 1000 pages seems high to me. Doing a Google search:

Rent and Reserve NVIDIA A100 GPU 80GB - Pricing Starts from $1.35/hour

I just don't know if, in 1 hour and with an A100, I can process more than 1000 pages. I'm guessing yes.
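A back-of-envelope check of the comparison, using the two prices from the comment ($1 per 1000 pages for the service, $1.35/hour for a rented A100; actual throughput per A100-hour is the unknown):

```python
service_cost_per_page = 1.00 / 1000  # $0.001 per page
gpu_cost_per_hour = 1.35             # quoted A100 rental rate

# Pages per hour the A100 must process to match the service's price:
breakeven_pages_per_hour = gpu_cost_per_hour / service_cost_per_page

# So the rented GPU is only cheaper if it clears ~1350 pages/hour,
# ignoring setup time, idle time, and engineering effort.
assert round(breakeven_pages_per_hour) == 1350
```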


Is the model open source / open weights? Otherwise the cost is for the model, not the GPU.

