bajsejohannes's comments | Hacker News

Capitalization is ignored by the compiler. So you can call it REPEAT, repeat, rEpEaT and so on. Same for variable names, functions, etc.


Which is something that can cause annoying bugs, when two identifiers that look "obviously" different in CamelCase are interpreted as identical by the compiler…


I don't know, but anecdotally, I still use it for work, but no longer for personal chats.

It also seems like someone at Slack is tasked with driving up engagement, because I get these "Your team is missing you" messages from Slack (only to find a dead Slack community). That might be a sign that they're losing traction?


Another version that's useful is this ASCII version: https://gabmus.org/posts/raspberry_pi_pico_pinout_in_your_te...

I keep a slightly modified version of it as a top comment in my main C file in every pico project. Super handy for quick reference and you can annotate it with the actual uses in your project.
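
For illustration, a minimal version of what that top-of-file comment can look like (abbreviated here, and the annotations are just made-up examples):

    /*
     * Raspberry Pi Pico pinout (abbreviated -- see the link above for
     * the full version). Annotations after "--" are this project's uses.
     *
     *             +--USB--+
     *   GP0   1  |o      o|  40  VBUS
     *   GP1   2  |o      o|  39  VSYS
     *   GND   3  |o      o|  38  GND
     *   GP2   4  |o      o|  37  3V3_EN
     *   GP3   5  |o      o|  36  3V3(OUT)
     *   ...      |        |      ...
     *
     *   GP0 -- UART0 TX (debug console)
     *   GP1 -- UART0 RX (debug console)
     *   GP2 -- status LED
     */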


I did something like this called “picopins” (pip install picopins) which gave a CLI ASCII-like pinout with search.

ASCII-only really cuts to the meat of the problem though.


Thanks! I've used pinout.xyz quite a few times; maybe you should link from there to the pico versions so they're easier to discover?


Agreed. Thanks!

I have definitely struggled with making the Pinout spinoffs discoverable - the OG site had ten-plus years to bed in.


One thing I don't understand from watching the video is what happens in the (very rare) case that you get collisions all the way down the funnel. I assume this is related to the "One special final level to catch a few keys" (around 14:41 in the video), but given that it has to be of fixed size, this can also get full. What do you do in that case?
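
(To make the question concrete, here's a toy sketch of the structure as I understand it -- not the paper's actual algorithm, and the names and sizes are made up: each level is a fixed-size array, a collision drops you to the next level, and a fixed-size final level catches the rest. The question is what to do when funnel_insert returns false.)

    #include <stdbool.h>
    #include <stddef.h>

    #define LEVELS     4
    #define LEVEL_SIZE 64   /* toy sizes; the real levels shrink geometrically */

    typedef struct { int key; bool used; } slot_t;

    static slot_t level[LEVELS][LEVEL_SIZE];
    static slot_t final_level[LEVEL_SIZE];   /* the fixed-size catch-all */

    static size_t toy_hash(int key, int lvl) {
        return ((size_t)key * 2654435761u + (size_t)lvl) % LEVEL_SIZE;
    }

    bool funnel_insert(int key) {
        for (int l = 0; l < LEVELS; l++) {
            slot_t *s = &level[l][toy_hash(key, l)];
            if (!s->used) { *s = (slot_t){ key, true }; return true; }
            /* collision: fall through to the next level */
        }
        /* final level: linear probing over a fixed-size array */
        size_t h = toy_hash(key, LEVELS);
        for (size_t i = 0; i < LEVEL_SIZE; i++) {
            slot_t *s = &final_level[(h + i) % LEVEL_SIZE];
            if (!s->used) { *s = (slot_t){ key, true }; return true; }
        }
        return false;   /* collisions all the way down -- now what? */
    }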


The dereference table allows allocations to fail:

https://arxiv.org/pdf/2501.02305#:~:text=If%20both%20buckets...

(the text fragment doesn't seem to work in a PDF, it's the 12th page, first paragraph)


Thanks! So I guess the best recourse then is to resize the table? Seems like that should be part of the analysis, even if the probability of it happening is low. I haven't read the paper, though, so no strong opinion here...

(By the way, the text fragment does work somewhat in Firefox. Not on the first load, but it works if you load the page, then focus the URL field and press enter)


Yeah, I presume so. At least that's what Swiss Tables do. The paper is focused more on asymptotics than on real-world hardware performance, so I can see why they chose not to handle such edge cases.


This bothered me too. Reading it, the sample implementations I've found so far just bail out. I thought one of the benefits of hash tables was that they don't have a predefined size?


The hash tables a programmer interacts with generally do have a fixed size at any given moment, but resize on demand. The idea of a fixed size is inherent to open-addressing hash tables -- how else could you even talk about how full a hash table is?
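
So "how full" is always measured against the current capacity, and crossing a load-factor threshold triggers an allocate-and-rehash. A minimal open-addressing sketch of that pattern (my own illustration, not from the paper; error handling omitted):

    #include <stdbool.h>
    #include <stdlib.h>

    typedef struct {
        int   *keys;
        bool  *used;
        size_t cap;    /* fixed at any instant; load factor = count / cap */
        size_t count;
    } table_t;

    /* Linear probing; returns false only if the table is completely full. */
    static bool insert_no_grow(table_t *t, int key) {
        size_t h = (size_t)key * 2654435761u % t->cap;
        for (size_t i = 0; i < t->cap; i++) {
            size_t j = (h + i) % t->cap;
            if (!t->used[j]) {
                t->used[j] = true;
                t->keys[j] = key;
                t->count++;
                return true;
            }
        }
        return false;
    }

    /* "Resize on demand": double the capacity and rehash every live key. */
    static void grow(table_t *t) {
        table_t big = { 0 };
        big.cap  = t->cap * 2;
        big.keys = malloc(big.cap * sizeof *big.keys);
        big.used = calloc(big.cap, sizeof *big.used);
        for (size_t i = 0; i < t->cap; i++)
            if (t->used[i]) insert_no_grow(&big, t->keys[i]);
        free(t->keys);
        free(t->used);
        *t = big;
    }

    void insert(table_t *t, int key) {
        if (4 * (t->count + 1) > 3 * t->cap)   /* keep load factor < 3/4 */
            grow(t);
        insert_no_grow(t, key);
    }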


The UX on MacOS is so bad here. First, a notification prompts you to enable Apple Intelligence. When you dismiss the notification by clicking the "x" in the corner, it instead opens the system settings and proceeds to download something (?) before showing you a checkbox where you can enable/disable it. It feels quite forced.


Desperate product owners resort to desperate measures to juice metrics.


My company is comparatively tiny (~150ish employees), and even there I've heard various PMs unironically say things like "Let's enable it by default for everyone, because we have those KPIs to hit!"

I've fought back against so much BS like this, but it's just endless and I can't win 'em all. Who cares about good UX, or about not nagging our (paying) customers incessantly about features nobody has ever asked for, while the core product languishes and 80% of our customer feedback is "Please make the platform more stable"? All that matters is AI, and that EVERYONE is forced into using AI, so our CEO can say in a slide deck that we've gained X% usage of our shiny new AI thing (which everyone subsequently disables as soon as they can).

It's a fucking joke honestly, this whole industry is a complete farce.


I don't know -- I don't think that there's a particular social contract (much less a legal one) between companies and users that the offering they provide today will be unchanged forever.

I don't mean to defend the dark pattern in this particular case; I'm responding to you saying "this whole industry is a complete farce". If a company decides that The Way to use their product needs to be nudged in a different direction, they can. (Almost) nobody complained when Macs started shipping with Rosetta [0] installed.

I'm nowhere near as confident that Apple Intelligence is worth betting the goodwill of users on as I was about apple silicon + rosetta for intel binaries, but it's Apple's bet to make.

[0] okay, a stub launcher for intel binaries that made it super quick and easy to get Rosetta installed


It's really pathetic. Reminds me of when phone manufacturers make a hardware button stop doing what people are used to and make it do Feature-Of-The-Month. Sorry, your power button now turns on FeatureX instead of toggling power. All so a fraction of a percent of users accidentally and unintentionally invoke unwanted features.


Screams in Bixby

Looks at the TV remote with its "Netflix" and "Apple TV" buttons

Screams even louder


My Roku remote still has an Rdio button. Makes me sad each time I notice it.


Streaming device makers make a significant share of their gross margins from sales of those buttons.

(That and an Apple-like mandatory revenue share on the channels that users choose / install.)


Before the 2010s, software didn't feel like a set of features competing for attention. Is the product owner role a new invention, or what else happened?


It's not just "product owners." When you're one of 100 teams in BigCorp, your team might own Feature X, and another team owns Feature Y. If teams with more "successful" features grow faster, get more funding, get more compute time, get bigger, fatter org charts, then your whole team is incentivized to fight to make Feature X more prominent and elbow out Feature Y.

As an end user, when you start your device or application or web page, know that the premium placement of the features exposed on the first screen, "above the fold" as they say, was likely fought over bitterly, through epic corporate political battles and backstabbing. They're not there because research showed that users want them conveniently located.


Raymond Chen, 2006:

https://devblogs.microsoft.com/oldnewthing/20061101-03/?p=29...

> I often find myself saying, “I bet somebody got a really nice bonus for that feature.” “That feature” is something aggressively user-hostile, like forcing a shortcut into the Quick Launch bar or the Favorites menu, like automatically turning on a taskbar toolbar, like adding an icon to the notification area that conveys no useful information but merely adds to the clutter, or (my favorite) like adding an extra item to the desktop context menu that takes several seconds to initialize and gives the user the ability to change some obscure feature of their video card.

> The thing is, all of these bad features were probably justified by some manager somewhere because it’s the only way their feature would get noticed. They have to justify their salary by pushing all these stupid ideas in the user’s faces. “Hey, look at me! I’m so cool!” After all, when the boss asks, “So, what did you accomplish in the past six months,” a manager can’t say, “Um, a bunch of stuff you can’t see. It just works better.” They have to say, “Oh, check out this feature, and that icon, and this dialog box.” Even if it’s a stupid feature.

This bullshit has been with us since there have been desktop computers with notification areas.


Have you forgotten that Clippy used to knock on the glass if you ignored it?


The actual feature set is rather disappointing, too. I don’t want magical summaries of texts or notifications. I don’t want a poor implementation of an email categorization feature that’s years late to market. I do want a better Siri, but that means more actual capabilities to control things, especially when triggered from a watch. I don’t want slow, unreliable language models that still can’t get “call so-and-so on Bluetooth” right [0].

What I do want is privacy-preserving AI-assisted search, over my own data, when (and only when) I ask for it. And maybe other AI features, again when and only when I ask for it. Give me hints that I can ask for such assistance, but don’t shove the assistance in my face.

[0] Somewhere along the line this improved from complete fail to calling, with Bluetooth selected, but audio still routed to the watch until I touch the phone.


I agree with OP that it's unnecessarily confusing. A "method" is a procedure. The floating point number is the result of that procedure, not the procedure itself.

"Decimal" implies a ten based system, even though it's perfectly fine to say "binary decimal".

Using your own replacement words, it would be clearer to write "A floating point number is a representation of a number with a fractional part".


Maybe it should've said "is a method of storing" instead of "for". It would make it clear it's not talking about a procedure, but a way or manner of doing something.
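
Either way, the stored thing is just a bit pattern -- sign, exponent, fraction -- and the value falls out of a fixed formula. A small sketch unpacking an IEEE-754 single-precision float:

    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    int main(void) {
        float f = 0.1f;                   /* famously not exactly representable */
        uint32_t bits;
        memcpy(&bits, &f, sizeof bits);   /* portable way to view the bits */

        uint32_t sign = bits >> 31;
        uint32_t exp  = (bits >> 23) & 0xFFu;   /* biased by 127 */
        uint32_t frac = bits & 0x7FFFFFu;       /* 23-bit fraction */

        /* For normal numbers: value = (-1)^sign * 1.frac * 2^(exp - 127) */
        printf("sign=%u exp=%u (unbiased %d) frac=0x%06X\n",
               (unsigned)sign, (unsigned)exp, (int)exp - 127, (unsigned)frac);
        printf("%.20f\n", f);   /* prints 0.10000000149011611938... */
        return 0;
    }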


How much of luggage handling is the airline vs. the airport crew? I had assumed it was mostly the latter.


And there's also the whole airport luggage handling system. These are not simple anymore, but massive automated systems, with all the usual issues of dealing with real physical objects in addition to the identifiers associated with them...


> admittedly in spurts

I haven't watched his videos, but this seems like a sign of quality. Making a video when you have a great idea vs making a video a week or day just to feed the algorithm.


There's also an MPU in even simpler/cheaper MCUs. For instance, the ARM Cortex-M0+ sports an MPU, and that core is used in the STM32C0 ($0.24 in bulk) and the RP2040.

I have no idea how the landscape looks in general, though.
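
For a taste of what's involved, here's a rough, untested sketch of setting up one ARMv6-M MPU region by hand to make a 4 KB block read-only (register layout from the ARMv6-M manual; on a real project you'd use the vendor SDK or CMSIS instead, and should double-check all of this):

    #include <stdint.h>

    /* ARMv6-M MPU registers (Cortex-M0+ with the optional MPU fitted) */
    #define MPU_CTRL (*(volatile uint32_t *)0xE000ED94u)
    #define MPU_RNR  (*(volatile uint32_t *)0xE000ED98u)
    #define MPU_RBAR (*(volatile uint32_t *)0xE000ED9Cu)
    #define MPU_RASR (*(volatile uint32_t *)0xE000EDA0u)

    /* Make the 4 KB region at `base` read-only for everyone. */
    void mpu_protect_4k(uint32_t base) {
        MPU_RNR  = 0;                   /* select region 0 */
        MPU_RBAR = base & ~0xFFFu;      /* 4 KB-aligned base address */
        MPU_RASR = (0x6u << 24)         /* AP = read-only, priv + unpriv */
                 | (11u  << 1)          /* SIZE: 2^(11+1) = 4096 bytes */
                 | 1u;                  /* region enable */
        MPU_CTRL = (1u << 2) | 1u;      /* PRIVDEFENA | ENABLE */
        __asm volatile ("dsb\n\tisb" ::: "memory");  /* apply the new map */
    }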


The vast majority of modern MCUs have enough memory protection for Tock. Anything Cortex-M0+ or "better" has an MPU, and RISC-V's PMP or ePMP works as well. Most 16-bit "legacy" (though still popular) MCUs don't.

Virtually anything with a radio these days qualifies (the MSPs were holdouts, but even those are mostly Cortex-M these days as well).

