
I was part of the Mozilla team that kickstarted the WebVR spec that predates WebXR. I've also been maintaining A-Frame (a framework for developing web-based AR/VR experiences) for more than 8 years. This announcement makes me so happy. It's been a decade-long effort.

I've been fully de-googled for years

Some friends don't or can't avoid Google, so I install:

Gapps Browser by Toby Kurien if they want a de-googled phone.

It's a set of sandboxed web apps for Google stuff, so you can use Google Maps, Gmail, etc. without having any Google code on your phone. It's on F-Droid.

Replace YouTube with one of the plenty of open-source apps for viewing it:

NewPipe for Android - no ads, no waiting, no data collection.

Or use an Invidious front-end instance.

I use the LibRedirect addon for Firefox.

You can set it to redirect away from most of the annoying big-tech sites.

Once set, any link you click is automatically redirected to the alternative site you chose:

I have set it like this:

reddit goes to https://reddit.simo.sh

youtube goes to https://invidious.private.coffee

goodreads goes to https://biblioreads.ml

IMDB goes to https://libremdb.iket.me

Stack overflow goes to https://libremdb.iket.me

A much cleaner experience.


I went to school out East and work out West, and consumer tech is way more of a NYC thing than a Bay Area thing, because the personas and professional networks are out there.

West Coast is also enterprise or business tech driven, but those founders aren't as media friendly or sexy despite being the majority (hence the Musks and increasingly Altmans hogging the limelight).

Boston has potential, but it honestly isn't leveraging it. The elitism is rife to a level unlike in California. A NU, BU, or UMass Amherst founder isn't going to be in the same circles as the Harvard and MIT founders who can leverage the I-Lab or Engine and HBS+Sloan resources, but in the Bay Area, a UCB, UCSC, SJSU, and Stanford kid will all be in the same professional circles. CIC tried, but they are trash. At one point, most startups in Greater Boston were basically Israeli companies using it as a US HQ because of the El Al direct and the large Israeli diaspora (throw a rock and you'll hit a Cafe Landwer).

Everything is tied up in elitism and old, entrenched institutions out East (where did you study), while out West it's much more output-driven (where do you work).

It works well for its biotech innovation space though, which Boston is known for.

(ironically, I liked DC except the humidity - way less stick up their butt, but they also have a bohemian streak)


The issue is not lack of raw bandwidth, it's getting the hardware, software, drivers and OS "to do the right thing".

PCIe gives you the building blocks of posted transactions and non-posted transactions but doesn't help you use them effectively. There is no coordinated or designated DMA subsystem to help move data between the root-complex ("host") and end-point ("device").

So, if you have to design a new PCIe end-point (target in original PCI terms) using an FPGA or ASIC then trying to actually sustain PCIe throughput in either "direction" isn't trivial.

Posted transactions ("writes") are 'fire and forget' and non-posted transactions ("reads") have a request/acknowledgement system, flow-control, etc.

If you can get your "system" to use ONLY posted writes (fire and forget) with a large enough MPS (payload size), usually >128 Bytes, then you can get to 80%-95% of theoretical throughput (1).
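The relationship between payload size and efficiency can be sketched with a quick back-of-the-envelope calculation. This is a rough model only: the 24-byte per-TLP overhead is an assumption (header plus DLL/PHY framing), and real overhead varies by PCIe generation and header format.

```python
# Back-of-the-envelope PCIe posted-write efficiency vs. Max Payload Size.
# Assumes ~24 bytes of per-TLP overhead (3DW/4DW header plus sequence,
# LCRC, and framing); the real number depends on generation and format.

TLP_OVERHEAD_BYTES = 24  # assumption, not a spec value

def payload_efficiency(mps_bytes: int, overhead: int = TLP_OVERHEAD_BYTES) -> float:
    """Fraction of link bytes carrying payload for back-to-back posted writes."""
    return mps_bytes / (mps_bytes + overhead)

if __name__ == "__main__":
    for mps in (64, 128, 256, 512):
        print(f"MPS {mps:4d} B -> {payload_efficiency(mps):.1%} of raw link rate")
```

Under this assumption an MPS of 128 bytes already lands in the 80-95% range quoted above, and doubling the MPS buys progressively less.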

The real difficulty is if you need to do a PCIe 'read' this breaks down into a read-request (MRd) and a Completion with Data (CplD). The 'read' results in a lot of back and forth traffic and tracking the MRds/CplDs becomes a challenge (2).

Often an end-point can use 'posted writes' to blast data to the PCIe root-complex (usually the CPU/host), maximizing throughput, since a host usually has hundreds of megabytes of RAM to use for buffers. Unfortunately, to transfer data from the root-complex (host) to the end-point (device), the host usually has the device's DMA controller initiate a 'read' from the host's memory, which results in these split transactions, since end-points don't often carry hundreds of MB of RAM. This also means bespoke drivers, tying into the OS PCIe subsystems, and hopefully not losing any MSI-X interrupts.

To reiterate: in the modern "Intel way," the CPU houses the PCIe root complex but does not house ANY DMA controller. So getting "DMA" working means each PCIe end-point's implementation has some kind of DMA "controller" which is different from the DMA controllers of all other end-points, rather than Intel having spec'd out an "optional" centralized DMA controller in the root complex.

1: https://cdrdv2-public.intel.com/666650/an456-683541-666650.p...

2: https://www.intel.com/content/www/us/en/docs/programmable/68...


Full Disclosure: I'm a cofounder at Voltron Data.

This is such a good post. I'm pretty humbled by your words about us being "everywhere that's of interest" and that "we're highly respected." It's hard to see that when you're in the weeds, so I just wanted to say I appreciate it.

Regarding proprietary…I get it. I was the CEO of BlazingSQL, and we were fully OSS with an open-core model. The number of Fortune 500 customers that were deploying us at scale but not paying us in money, feedback, or testimonials was honestly heartbreaking.

When Josh (our CEO) and I were in the early days of Voltron Data, we thought maybe we could hold ourselves accountable to the open-source community with a new model, which we now call open-periphery, where, as you said, the interchanges, standards, and protocols are open, allowing companies and developers to build resilient, evolvable data stacks.

Open-periphery also means we don't have to debate what goes back to the community and what goes into the proprietary code because there is such a clear delineation. Open-periphery is our way of thinking about OSS business models, and it's the solution we came up with to ensure we can continue to invest in open-source and next-generation query engines.


i'm always too paranoid that what I think is the Big Bad (https://tvtropes.org/pmwiki/pmwiki.php/Main/BigBad) is actually just another Dragon (https://tvtropes.org/pmwiki/pmwiki.php/Main/TheDragon)

or if I know I'm at The End, how should I know how many stages the Final Boss has (https://tvtropes.org/pmwiki/pmwiki.php/Main/TrueFinalBoss)

my pockets during the final cutscenes may be stuffed to the gills, but at least I made it

> For instance, using a ‘Speak with the Dead’ scroll on a certain suspicious corpse unveiled a questline I would have otherwise missed.

This is a pretty common scroll - you can buy them from most stores, and you later learn a spell that lets you use it constantly.

If you want to be brave, blow that Ogre Horn (https://bg3.wiki/wiki/Lump%27s_War_Horn) during a fight with, like, three mud monsters.


Not playing everywhere is an existential risk. Because IT is so integrative, if anyone gets a huge leg up in any sector, Apple risks getting shut out in a lot of other sectors.

Spatial Computing/Vision Pro is, for example, a claim staked in VR. It gives them some exposure to the market, some ability to extend & integrate their existing application/computing ecosystem into this medium. Ditto for all the other pieces; they're all integrative. Smart speakers work with AirPlay. Apple Watch works with iPhone works with iOS. The close integration lets them rebuff innovation in any other field: no one small can come along and build the next VR headset or the best watch to compete with Apple in any of these sectors, because no one can integrate like Apple. No one else has all the products. You have to have complete, over-the-horizon horizontal control to keep your intense market power, and Apple is invested above all in there never being a chink in that armor, in making sure they can completely dictate the shape of all products by producing & owning all the products themselves. This is Apple.

Hence, Apple has to dabble everywhere. It maintains the moat, it prevents real competition from forming, and it earns them a couple % of revenue here and there to boot.

Expressing an entirely different brand of cynicism, I'd also ask: where else could Apple look to expand into? None of these sectors has been a runaway success. But it's not like Apple's missing the boat on some massive new tech that has a huge new Total Addressable Market. It's been absent from crypto and AI, but generally it's just expanding wherever there's any opportunity, and why not, when you have the cash & when any sector could become huge?


This adds more to the evidence that Vulkan / DX12 seems like a failed API design, when so many graphics engineers are reinventing the wheel by building their render-graph API on top of these (proprietary triple-A game engines, Unreal Engine, and now Godot...). Instead of real-time graphics APIs providing all these low-level manual synchronization primitives and cumbersome PSOs, maybe they should just provide an official render-graph API instead? Provide all of the operations and their dependencies in an acyclic graph up-front, and the driver handles synchronization automatically in the most performant manner tailored to the hardware.
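The render-graph idea the engines keep reinventing can be sketched in a few lines: passes declare which resources they read and write, and a scheduler derives an execution order (and, in a real implementation, the barriers) from the resulting DAG. This is illustrative only - the pass/resource names are made up, and `TopologicalSorter` stands in for a driver-side scheduler.

```python
# Minimal render-graph sketch: derive pass ordering from declared
# resource reads/writes instead of hand-placed synchronization.
from graphlib import TopologicalSorter

passes = {
    # pass name: (reads, writes) -- illustrative names, not a real API
    "gbuffer":  (set(),                                       {"albedo", "normals", "depth"}),
    "shadows":  (set(),                                       {"shadow_map"}),
    "lighting": ({"albedo", "normals", "depth", "shadow_map"}, {"hdr"}),
    "tonemap":  ({"hdr"},                                     {"backbuffer"}),
}

# A pass depends on every pass that writes a resource it reads.
writers = {res: name for name, (_, writes) in passes.items() for res in writes}
graph = {name: {writers[r] for r in reads if r in writers}
         for name, (reads, _) in passes.items()}

# The "driver" picks any valid order; barriers would go on each edge.
order = list(TopologicalSorter(graph).static_order())
print(order)  # dependencies always come before their consumers
```

A real implementation would also use the same graph to alias transient resource memory and to batch barriers, which is exactly the bookkeeping the current APIs push onto every engine.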

I guess there needed to be some trial-and-error in the gamedev world for about a decade to really nail down a graphics API design that is nice to use but also performant. Vulkan originating from AMD's Mantle API didn't help - since it was a low-level console-style API mainly tailored to AMD's GPUs, it really didn't seem like it would fit a more "general-purpose" API spanning a huge range of hardware and able to stand the test of time. And with Microsoft's DX12 hastily copying from AMD's initial design, it has all the same issues (the irony is that DX11 is still the best graphics API you can use in gamedev on Windows in terms of ergonomics and even performance - seeing many dauntingly try to build a DX12 backend and end up performing worse than DX11...)

Nowadays I'm observing that the industry has known about these issues for a while and is experimenting with alternative API designs... there are some experimental render-graph extensions available in both DX12 / Vulkan (albeit in a limited fashion):

- DX12's Work Graph API as preview (https://devblogs.microsoft.com/directx/d3d12-work-graphs-pre...)

- Vulkan's VK_AMDX_shader_enqueue extension (https://gpuopen.com/gpu-work-graphs-in-vulkan/)


The Computer Revolution hasn't happened yet, OOPSLA 1997 keynote

https://www.youtube.com/watch?v=oKg1hTOQXoY

Others have already mentioned The Early History of Smalltalk, highly recommended. You'll probably want to read it a couple of times, revisit from time to time.

The big idea is messaging, or rather "ma"

http://lists.squeakfoundation.org/pipermail/squeak-dev/1998-...

"The key in making great and growable systems is much more to design how its modules communicate rather than what their internal properties and behaviors should be."

"I think I recall also pointing out that it is vitally important not just to have a complete metasystem, but to have fences that help guard the crossing of metaboundaries."

" I would say that a system that allowed other metathings to be done in the ordinary course of programming (like changing what inheritance means, or what is an instance) is a bad design. (I believe that systems should allow these things, but the design should be such that there are clear fences that have to be crossed when serious extensions are made.)"

"I would suggest that more progress could be made if the smart and talented Squeak list would think more about what the next step in metaprogramming should be -- how can we get great power, parsimony, AND security of meaning?"


The idea of computing as a shared stage to reflect our own intelligence is really what sticks out to me as the best way to frame what interacting with a computer means. It's not new, but Alan did a great job of motivating and framing it here. Thanks for posting this great reminder that what we use as computers today are still only poor imitations of what could truly be done if we can transport our minds to be more directly players on that stage. It's interesting to reflect the other way as well: if we are the actors reflecting a computer to itself, an AGI has to imagine and reflect in a space created of our ideas. To be native, the AI needs better tools - the "mouse" of its body controlling the closed loop of its "graphics". How do we create such a space that is more directly shared, dynamically trading between actor and audience in an improvisational exchange? This is the human-computer symbiosis I seek.

The argument is in the premise, a powerful tool used by people sophisticated in public messaging:

The premise here is that the status-quo beliefs are knowledge, while ideas that disagree are 'activism' - political activity - and are not knowledge. It's fundamentally a highly conservative framing that, obviously and intentionally, protects the status-quo power.

Once you accept the premise, everything else follows - assuming X is political activity, and school is for learning, not so much for political activity, then obviously.... The effective move is not to argue directly for the premise - that makes the premise an issue on the table, something to debate. Instead, assume it in your argument: Stop this politicized liberal activism! See how that works? Even people who disagree have to figure out what they disagree about and construct an argument.

In reality, neither idea is more political than the other, and the university, of all places, is where to explore and develop new ideas. Teaching the status-quo idea is just as much indoctrination as any other idea - maybe more, because it doesn't raise the question of challenging itself. The real question is how they are taught. And regardless, I trust the students at Yale are smart enough not to be so easily manipulated - just like the people on HN, of course, who know so well what's good for the students.


because large companies value repeatable mediocrity over actual insight

It looks cursed because it is cursed. It's the classic "back to the basics" web technology nonsense that is hello world optimized and completely unsuited to any real world business purpose.

These seemingly simple template languages that promise an escape from the "unnecessary" bloat and complexity of the modern web are doomed to fail. Any level of adoption will drive deeper usage, which will drive more complex use cases, which will force more bolted-on general programming language features. Eventually your users discover (the hard way) why all the complexity in modern web frameworks exists, but now they're stuck programming in a shitty template language that is underbaked, awkward to use, has no ecosystem, shitty tooling and some rancid expression DSL you have to memorize. Cursed indeed.


Does it mean he is creating another for-profit company in addition to the non-profit OpenAI? Would it (eventually) lead to a conflict of interests?

The Tao Te Ching starts with this line, which I adore:

> The tao that can be told is not the eternal Tao

“The Tao” here means “the rules / way to live your life”. Essentially, the point is that if you try to write down a set of complete rules for how to act - either in life generally or at work, well, that ain’t it.

This is always my problem with OKRs. They’ll always inevitably lead you away from what you actually should / need to do in a given moment if you were in tune with your wisdom. Instead of trying to codify what leaders do, we should practice staying present to what’s actually going on and practice wisdom - for whatever that means in the current context.


I had such a great company social night, talking with an engineer who got his start here ~5 years ago.

They were talking about how they just want people to have a well-formed idea of what to build, to be able to hand off clear expectations, and let them roll. For curiosity's sake I started asking about other times: has there been a time where you've felt on the line, been the one who has to figure out what to do?

They paused for a bit & then said yeah, actually... They had been battlefield-promoted after two higher-ups on a team had left & it was just them running this product. They said they had little idea what they were doing, but the company trusted them & let them hack through it. They loved that time. Finding out what to do, being given problems and the freedom to solve them, was a highlight of their life, they said.

A lot of people don't want to play the game. Especially when we are forced to collaborate with non-technical people, it's incredibly hard to justify and explain ourselves, to share power, and to compromise with people who lack the competency to judge, assess, or negotiate.

The title here omits the God's-honest truth as an option: we the engineers know & can assess, and you the business/product side don't have the technical chops to debate, nor do you understand what is to be built. The premise presented is "debate vs do", as they say it, as though product and product alone understands "do". But I think most engineers live in pain and dissonance and sadness, feel an incredible impedance and struggle, because most businesses/product have only the faintest, fragmentary, propped-up fake idea of what "do" is. It's a fiction. And it's up to engineers to cobble together some vaguely competent rendition of the fairy-tale nonsense product tells itself it's come up with and that engineers need to just do.

The phrasing here could not be more slanted. Run, engineer, run, from the demented terrors who think their product sensibilities have fully fleshed out the idea, who think there is no cause for "debate" or sussing out how really to do a thing, who think only to "do" what the master product says is necessary.


> but purist progressives in the Obama and now Biden admin pushed them away

Autocratic rulers like MBS deciding to cut up journalists/opposition political figures into tiny pieces with bone saws inside Saudi consulates didn't help matters. The whole Khashoggi incident really illustrated exactly what the Saudi regime thinks of rule of law and the human rights of their own citizens when it's boiled down to the barest essentials. US senators, congressmen, and foreign-service career people have taken note.

It's still worth noting that the Saudi military/air force/other armed forces are extremely large customers of US/NATO spec equipment and UK origin equipment.

It would be worth remembering that 15 of the 19 9/11 hijackers came from Saudi Arabia and there were very clear financial/funding connections from wealthy persons within the Kingdom to the pre-9/11 training program. Highly reputable journalists and intelligence sources have also extensively documented the Saudi funding sources that supported (and still support to this day) Wahhabist madrassas in Pakistan and Afghanistan, the "V1.0" of the Taliban in the 1990s, and other fundamentalist Salafist jihadi groups.

Invading Saudi Arabia for regime change instead of Iraq in 2003 would have been much more logical, if anyone in the US and UK had had the fortitude to do it. It would have also been vastly messier.

It's well known among people who study foreign affairs that Iran funds and arms Shia and Shia-adjacent armed groups (Houthis, Hezbollah, etc.). But this doesn't happen in a vacuum - to some extent it is the IRGC's and Iran's reaction to the well-documented and widely known Saudi support for Salafist jihadism.

It's also well known and documented that the Saudis have been investing vast amounts of their oil wealth in the US stock market, real estate and other equities since the mid-1960s, so the interconnected financial relationship between the US and the Kingdom would be extremely difficult, if not impossible, to disentangle at this point in 2024.

Despite the Khashoggi affair and the other problems described above, I think it's pretty clear that US decision-makers still consider Saudi Arabia a much more trustworthy regional "partner" than Iran. Ongoing US/UK contractor support of all of their armed forces (and the US/UK relationship with Saudi Aramco) and ongoing exports of munitions to Saudi Arabia back up this theory.


Fundamentally, the 'three round' model was about risks: the first round covered execution risk (can the team actually build what they say they can?), the second market risk (will the market see enough value in the product to pay enough for it to give the company at least 33% net margins for R&D/growth?), and the third scaling risk (what is the addressable market for this product? How much has been reached so far? How much more could be reached?).

After those three rounds you are a going concern with hopefully 10 - 20% market share and an IPO will repay the investors, give you a regular way to raise capital on the open markets, Etc.

Way too many 21st century start ups were investors scamming other investors :-)


Great conversation. The best part IMO was Patrick's statement about the importance of the Internet. Agree 100% and never heard anyone express that view so clearly.

"I have an ebullience and love for it in a way that people in our social class in the United States are aggressively socialized out of having ebullience and true love for anything. I think that the internet is the capital G, capital W, Great Work of the human race in a lot of respects, that it is magical. It is an encapsulation of the best things about our society that is also tremendously, instrumentally useful in making all the good things better and ameliorating all the problems over sufficiently long time scales.

It seems natural to me that this is extremely important. This is extremely valuable, and it seems extremely underrated by almost everyone, including people who would consider themselves great fans of the internet but say, “Oh, I’m a great fan of the internet, but I’m a great fan of penicillin, too.”

I think, in aggregate, the internet is obviously more important than penicillin by many orders of magnitude. Is it more important than medicine? I will bite that bullet. The internet is more important than medicine, the entire institution of medicine, from time immemorial to reasonable extrapolations of what we can do right now. Is it more important than writing? You couldn’t have the internet without writing, so writing was very important to get to the development of the internet. That might be one of the most important things about writing, that writing got us to the internet. That sounds like a little bit of . . . I know people will take that full quote and say, “Oh, this crazy, nonintellectual person,” but I think that there is a reasonable case for it."


It reminds me of an obsession I had when I was young (maybe 12 or 13) where I kept iterating on a design for a mini-sub I had hoped to build. I must have checked out books on the history of the submarine about that time and became obsessed with the simplicity of the original Turtle submarine — operated with hand screws (propellers).

Likely too I saw a homemade sub or scuba tow on the odd Popular Mechanics cover....

I had read enough to incorporate a lead ballast that could be released from inside the sub. I imagined props and motors based around those electric trolling motors you can get for a small fishing boat. I therefore incorporated a car battery into the design. Front and rear ballast tanks allowed me to control the pitch trim. I imagined a small electric automotive tire pump would suffice to force the water out of the ballast tanks.

I obsessed over a mechanism to allow each trolling motor to be gimbaled from a pair of joysticks in the sub. I built mechanical models with paper drinking straws and toilet paper rolls to test the mechanics.

I played with different seating configurations to minimize the size of the sub but keep it "operatable".

It was a weird and impossible fantasy that never had a chance of moving beyond the drawing board stage. You know, especially for a kid with a single mother who was a secretary. But perhaps there was some intellectual and creative stimulation that I was feeding off at the time that made the effort worth it.

Thinking about it now though, how obsessive I was, it might also have spoken to a boredom, isolation, and maybe sadness I felt at the time. The sub might have been an escape for me.

To see someone build a sub for real is kind of cool. But it also makes clear how likely my design would have just collapsed right away at about 10 feet depth. I mean, I planned on using plywood for the hull, ha ha.


"The Making of the Atomic Bomb", by Richard Rhodes, best nonfiction book I ever read. Covers German and Japanese efforts, as well as giving a history of modern physics, beginning way back in the 19th century.

Did you know that Einstein was strikingly muscular, and at one point in his life had been deeply religious, until deciding that much of religion was "lies"?

There is also a frightening history of World War I, of Jews in Europe, and biographies of all the scientists involved in the US nuclear effort. Totally amazing.

I also read "Dark Sun: The Making of the Hydrogen Bomb", and "Masters of Death: The SS-Einsatzgruppen and the Invention of the Holocaust". The latter is deeply, profoundly, sickeningly graphic, and contains more information than you ever knew existed.


Oh boy, an opportunity to flex my AgSci degree on HN! The invention of the Haber-Bosch (HB) process is more akin to a planetary credit-card loan. It has done absolutely nothing to increase the Earth's *genuine* carrying capacity and has (IMO) trapped humanity in insurmountable ecological debt. It has directly accelerated the mass depletion of countless other finite resources such as water, healthy topsoil, micronutrients, and more. The fact that it has allowed humanity to grow so much without consideration for other finite factors has also contributed to several secondary "loans" which have further trapped us in debt - most notably, humanity's dependence on monoculture farming, which ravages biodiversity, creates super-pests, and saps soil health. You simply cannot feed 8 billion people with organic farming methods, and the current energy cost of controlled-climate hydroponics makes it impractical at scale. As you mentioned, all of this growth also feeds humanity's collective appetite for, well, everything. Plus the HB process is responsible for 1.4% of global emissions, giant oceanic dead zones caused by runoff, increased acid rain, less nutritious food overall, and so, so much more. It's a planetary catch-22: without it billions will starve, but with it we continue to charge towards a mass-extinction event.

Tabbed isn't enough for me. I don't want an app. I want a website. I want my user agent! My user agent is what I know; it affords me URLs & extensions and a powerful UI with lots of options.

The PWA thing seems sick to me. Henry Ford listens to market research and rebuilds his car as a mechanized horse. For me, the UX is strictly worse in every way. (And I already knew how to put links to webpages on my home screens).

There does seem to be a browser display mode, but it's up to the app maker to decide for the user what mode the app will be in. Why?!

> Progressive Web Apps can run in various display modes determined by the display property in the web app manifest. Examples are fullscreen, standalone, minimal-ui, and browser.
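For reference, that `display` member lives in the web app manifest. A minimal illustrative manifest (the name, paths, and icon are placeholders) requesting the `browser` display mode - the one that keeps the full user agent - might look like:

```json
{
  "name": "Example App",
  "start_url": "/",
  "display": "browser",
  "icons": [{ "src": "/icon-192.png", "sizes": "192x192", "type": "image/png" }]
}
```

The spec-defined values are `fullscreen`, `standalone`, `minimal-ui`, and `browser`; the point above is that the manifest author, not the user, picks among them.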

It makes me so sad how much lesser a person a user of a PWA is. The utter lack of user agent, being cast to whatever is provided by the maker, is a horrifying loss. All to ape what felt to me like the descending losing old-guard technologies.


We push PWAs to iPads & Surface Go devices via Microsoft InTune for some of our clients today.

This path started out very nightmarish (circa 2020) but it's going much smoother today. One of our customers actually came back to us with a slightly improved process based upon the one we gave them. They switched from iPad to Surface Go and used some extra endpoint management to make the PWA experience into a sort of kiosk mode.

The #1 constraint for us is the quality of the environment-facing camera and the level of access we have to its capabilities via the browser. iOS/Safari started out extremely weak on this but is quite good today. I can get a solid 2k environment scan at 30fps from the rear facing iPad camera in Safari today. Things like 2D barcode scan and document capture are 100% feasible now. These items used to make us extremely nervous on product demos but we don't worry anymore.

We almost capitulated and went back to native iOS apps because of the camera issues, but the pain of maintaining a native build chain when you are otherwise a 100% Microsoft shop (with barely 3 developers) was pushing back hard too. We were signing enterprise IPAs for all of our clients for half a decade before we switched to web/PWA. I will never go back to native apps. I'll find a different career path and hobbies if the web goes away.

I don't have a clean answer for B2C other than... I use HN and Twitter in Safari and I don't even process that it's not a native app. Neither of these web properties had to spend a single second worrying about a native app to acquire my business.


The exact way in which git handles commits is very muddled - it's snapshots on the surface, a bit of diffs when packed, and a lot of operations on commits are actually 3-way merges (including merges, rebases, cherry-picks and reverts). Keeping track of all of this matters (esp. the operations that use diffs), but it can also get overwhelming for a tool.

In my opinion, it's probably good enough to understand the model git is trying to emulate. Commits are stored more or less like snapshot copies of the working-tree directory with commit information attached. The fact that there is de-duplication and packing behind the scenes is more a matter of storage efficiency than of any practical difference from the directory-snapshot model. Meanwhile, the more complex git operations (merges, rebases, reverts, etc.) use actual diff algorithms and 3-way merges (way more often than you'd imagine) to propagate changes between these snapshots. This is especially apparent in the case of rebases, where the snapshot model breaks down completely (modifying a commit will cause the same change to be carried into all subsequent commits).
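The snapshot-with-deduplication model can be illustrated with a toy content-addressed store. The `blob <len>\0` header mirrors how git actually hashes blobs, but everything else here is a sketch, not git's real object layout:

```python
# Toy illustration of the snapshot model: each "commit" records the full
# tree, but identical file contents hash to the same blob, so unchanged
# files cost nothing extra. (Real git additionally packs blobs with
# delta compression, a storage detail below this model.)
import hashlib

object_store: dict[str, bytes] = {}  # blob hash -> content

def store_blob(content: bytes) -> str:
    # Same header-plus-content hashing scheme git uses for blobs.
    digest = hashlib.sha1(b"blob %d\0" % len(content) + content).hexdigest()
    object_store[digest] = content  # idempotent: same content, same key
    return digest

def snapshot(tree: dict[str, bytes]) -> dict[str, str]:
    """A 'commit': map every path to the hash of its full content."""
    return {path: store_blob(data) for path, data in tree.items()}

c1 = snapshot({"README": b"hello", "main.py": b"print(1)\n"})
c2 = snapshot({"README": b"hello", "main.py": b"print(2)\n"})
# Only main.py changed, so the store holds 3 blobs, not 4.
print(len(object_store))  # 3
```

Both commits record every file, yet the unchanged `README` is stored once - which is why "snapshots, not diffs" is cheaper than it sounds.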

This actually makes sense if you consider the development workflow of the Linux kernel before git. Versions were directories or on CVS, and a lot of development was based on quilt, diffutils and patchutils. Git covers all these use cases, though it may not be immediately apparent.

Added later: It's also interesting to look at Mercurial's model. Like Git, Mercurial uses both snapshots and diffs for storage. But unlike Git's way of layering the two, Mercurial interleaves them - diffs, with full snapshots interspersed occasionally. This is more like the video-codec concept of keyframes (I think that's what inspired it). It means that Mercurial, unlike Git, doesn't need repacking. And while Git exposes its internal model in its full glory, Mercurial manages to more or less abstract it away.
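The keyframe analogy can be sketched like this. It's a toy model: real Mercurial revlogs store binary deltas and decide when to take a full snapshot with size-based heuristics, not a fixed count.

```python
# Keyframe-style storage sketch: a full snapshot every few revisions,
# deltas in between, so reconstructing any revision never chases a long
# delta chain. The "delta" here is a stand-in (we just keep the text).

KEYFRAME_EVERY = 3
revisions: list[tuple[str, str]] = []  # (kind, data), kind is "full" or "delta"

def append_revision(text: str) -> None:
    if len(revisions) % KEYFRAME_EVERY == 0:
        revisions.append(("full", text))   # keyframe: full snapshot
    else:
        revisions.append(("delta", text))  # stand-in for a real delta

def chain_length(rev: int) -> int:
    """How many entries must be read to reconstruct revision `rev`."""
    n = 0
    while revisions[rev][0] != "full":
        n += 1
        rev -= 1
    return n + 1

for i in range(7):
    append_revision(f"contents v{i}")

print([kind for kind, _ in revisions])
# ['full', 'delta', 'delta', 'full', 'delta', 'delta', 'full']
print(chain_length(5))  # 3: revision 5 -> 4 -> keyframe at 3
```

Bounded chain length is what lets Mercurial skip the repacking step Git needs: reads stay cheap without ever rewriting old storage.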


Most of my direct relatives work for Bath Iron Works, many as fitters. I won't say much more. The article ain't wrong. Who is going to change it?

"Here's my favorite example, here: 1928, my hero, Walt Disney, created this extraordinary work, the birth of Mickey Mouse in the form of Steamboat Willie. But what you probably don't recognize about Steamboat Willie and his emergence into Mickey Mouse is that in 1928, Walt Disney, to use the language of the Disney Corporation today, "stole" Willie from Buster Keaton's "Steamboat Bill."

It was a parody, a take-off; it was built upon Steamboat Bill. Steamboat Bill was produced in 1928, no [waiting] 14 years--just take it, rip, mix, and burn, as he did [laughter] to produce the Disney empire. This was his character. Walt always parroted feature-length mainstream films to produce the Disney empire, and we see the product of this. This is the Disney Corporation: taking works in the public domain, and not even in the public domain, and turning them into vastly greater, new creativity. They took the works of this guy, these guys, the Brothers Grimm, who you think are probably great authors on their own. They produce these horrible stories, these fairy tales, which anybody should keep their children far from because they're utterly bloody and moralistic stories, and are not the sort of thing that children should see, but they were retold for us by the Disney Corporation. Now the Disney Corporation could do this because that culture lived in a commons, an intellectual commons, a cultural commons, where people could freely take and build. It was a lawyer-free zone."

-- Lawrence Lessig, "Free Culture", OSCON 2002 (https://youtu.be/uH4RskpUFiA?si=IHVC72F4oXpLHJVV&t=253)


I lived in Iceland for several years recently. One thing you have to keep in mind is that this philosophy is pretty much born out of necessity. Until the US Military built a base in Keflavík (which later became an international airport), the country was quite behind the times. As one friend put it to me: "When you were landing on the moon, we were getting running water installed."

Iceland was a very poor country for most people, other than a few who were well off in the fishing industry. Today it's still relatively "medium", with a few people very wealthy but also almost no one in destitution. The attitude is that everyone should be able to live a good life, and while it's not truly socialist, it's pretty darn close.

Þetta reddast is used somewhat interchangeably with "I don't feel like dealing with this", "Somehow things will work out even though it doesn't look like it now", and "fuck it". To most people it's kind of an in-joke you say when something really sucks.


(Founder of windmill.dev, the closest alternative to Airplane, and we are OSS)

Congrats on the acquisition Airplane team. You were a strong inspiration for us, a precursor and set a high-quality bar for pro-code developer platforms. I have nothing but respect for the Airplane team and we probably wouldn't exist in our current form without your competition.

We are ready to migrate all Airplane customers, and the migration would be very smooth, as many Airplane concepts map 1:1 to our own. If you need an urgent migration: ruben@windmill.dev

One customer, nocd, migrated hundreds of scheduled scripts and workflows in just a few weeks and with only minor changes.

Our platform is fully open-source, so you will never be at risk of us sunsetting anything, since you can fully self-host it (it is not a hybrid deployment model like Airplane's, where the control plane is in the cloud). We are used by thousands of businesses, including a few F500 at scale, and can send references over email.

We are a smaller team, have raised reasonably, and are close to break-even. As an open-source product, I am mindful to resist the urge to raise too aggressively, so we can keep control of our destiny and never betray our open-source principles of transparency and fair pricing.

(Note: we have an example repo of the folder layout one can use to be backed by git: https://github.com/windmill-labs/windmill-sync-example; it's not all that different from Airplane's. See our CLI guide here: https://www.windmill.dev/docs/advanced/cli)


It’s certainly been interesting watching the multi-decade arc play out. With Mach as the origin, everything other than tasks (processes), scheduling, and virtual memory was out of the kernel and done over Mach port comms. Then XNU, via NeXTSTEP and later OS X, linked much more into the kernel and exposed specific data types through the COM-like model in IOKit. And now more and more is moving back out of the kernel.

io_uring networking on Linux is another, similar move out to userspace.

