grumbelbart's comments | Hacker News

Mandatory: We should build it in space and beam the electricity back to earth using electromagnetic waves. We could collect those using solar cells. And then get rid of the plant and use the sun instead.


Yes - exactly. No need for a Dyson sphere, or a man-made sun - just use the real sun and solar panels!


Long-term this would be done using LLMs. It would also solve LLMs' code quality issues - they could simply prove that the code works right.


> simply prove that the code works right

Combining LLMs + formal methods/model checkers is a good idea, but it's far from simple because rolling the dice on some subsymbolic stochastic transpiler from your target programming language towards a modeling/proving language is pretty suspect. So suspect in fact that you'd probably want to prove some stuff about that process itself to have any confidence. And this is a whole emerging discipline actually.. see for example https://sail.doc.ic.ac.uk/software/


Maybe very long term. I turn off code assistants when doing Lean proofs because the success rate for just suggestions is close to zero.


That is correct. The permutation will typically break up into multiple cycles, and the number of cycles of each length k is asymptotically Poisson distributed with mean 1/k.

https://dms.umontreal.ca/~andrew/PDF/CycleLengths1.pdf
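
If you want to see this empirically, here is a small C sketch (mine, not from the paper) that draws one random permutation with Fisher-Yates and prints its cycle lengths; running it for many seeds shows the cycle statistics:

    #include <stdio.h>
    #include <stdlib.h>
    #include <stdbool.h>

    #define N 1000

    /* Fisher-Yates shuffle of the identity permutation
       (the modulo bias of rand() is ignored for brevity). */
    static void random_permutation(int *p, int n)
    {
        for (int i = 0; i < n; i++)
            p[i] = i;
        for (int i = n - 1; i > 0; i--) {
            int j = rand() % (i + 1);
            int tmp = p[i]; p[i] = p[j]; p[j] = tmp;
        }
    }

    int main(void)
    {
        int p[N];
        bool seen[N] = { false };

        srand(42);
        random_permutation(p, N);

        /* Walk each unvisited element around its cycle and report the length. */
        for (int i = 0; i < N; i++) {
            if (seen[i])
                continue;
            int len = 0;
            for (int j = i; !seen[j]; j = p[j]) {
                seen[j] = true;
                len++;
            }
            printf("cycle of length %d\n", len);
        }
        return 0;
    }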


There are optical accelerators on the market that - I believe - do that already, such as https://qant.com/photonic-computing/


That's a misconception. Earth is still pretty much in equilibrium and emits about as much energy as it receives from the sun. Global warming is due to the heat staying a bit longer in the system, not due to emitting less.
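
For reference, the textbook zero-dimensional energy balance behind this statement (standard values, not figures from this thread) is

    (1 - \alpha)\,\frac{S_0}{4} \;=\; \varepsilon\,\sigma\,T^4

with albedo α ≈ 0.3, solar constant S0 ≈ 1361 W/m², the Stefan-Boltzmann constant σ, and an effective emissivity ε. Adding greenhouse gases lowers ε slightly, so T has to rise until both sides match again; during that adjustment the imbalance is tiny compared to the total in- and outgoing flows.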


If the world is warming up it is not in equilibrium, or am I missing something?


If I have a tap flowing into a bucket, then once the bucket is full the amount of water flowing over the top will exactly equal the amount of water flowing in from the tap.

If I increase the size of the bucket, then once it has filled to the new level, the amount of water flowing over the top will again exactly equal the amount flowing from the tap.

There's also scale to consider: global warming is more like if my bucket was very soft, and I've stretched the plastic a little bit to give it slightly more volume (the analogy breaks down beyond this point).


I don’t understand the analogy. What’s the water and what’s the bucket?


Drilling is essentially an O(N^2) method. You need to replace your drill bit every X meters, and the time it takes to replace it is roughly linear in the current depth.
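
Back-of-the-envelope, assuming one bit change every X meters and a tripping (pull-out and run-in) time that grows linearly with depth at some rate c, which is roughly this comment's premise:

    T_{\text{trips}}(N) \;\approx\; \sum_{k=1}^{N/X} c\,kX
      \;=\; cX \cdot \frac{(N/X)\,(N/X + 1)}{2}
      \;\approx\; \frac{c}{2X}\,N^2

so the total time spent just swapping bits grows quadratically in the target depth N.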


That's just economy of scale, though. It's always more expensive to be the early adopter. In Switzerland, 15% of all buildings are heated using geothermal heat pumps.


Yes. And in Switzerland, I believe most new houses have some other type of heat pump (drilling for geothermal is not allowed everywhere, or is too expensive). This all still needs electricity, but many houses now install photovoltaics. (At least where I live.)


This is completely unrelated to the article, which is about geothermal power, not ground-source heat pumps.


I mean heat pumps where you drill around 200 meters deep. Whether you want to call those geothermal or ground-source, that is up to you I guess.


> Would the insulation also make it sound-proof?

Not necessarily. Thermal insulation uses light materials, while sound insulation requires heavy materials.


I once worked at a company producing software / operating systems for smart cards (such as the chips on your credit cards). We developed a simulator for the hardware that logged all changes to registers, memory and other state in a very large ring buffer, allowing us to undo / step backwards through code. With RAM being large, those chips being slow, and some snapshotting, we were usually able to undo all the way back to the reset of the card. That was a game changer for debugging the OS.
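
The core trick is easy to sketch. Below is a minimal, hypothetical version in C (the names and the byte granularity are my assumptions, not the actual simulator, which also logged registers and took periodic snapshots): every write first saves the old value into a ring buffer, so the debugger can step backwards by replaying the buffer in reverse.

    #include <stdint.h>
    #include <stddef.h>

    #define TRACE_SIZE (1u << 16)    /* number of remembered writes */

    struct trace_entry {
        uint8_t *addr;   /* location that was modified            */
        uint8_t  old;    /* value it held before the modification */
    };

    static struct trace_entry trace[TRACE_SIZE];
    static size_t trace_head;        /* next slot to use, wraps around */

    /* Called by the simulator for every memory or register-file write. */
    static void traced_write(uint8_t *addr, uint8_t value)
    {
        trace[trace_head].addr = addr;
        trace[trace_head].old  = *addr;
        trace_head = (trace_head + 1) % TRACE_SIZE;
        *addr = value;
    }

    /* Undo the most recent write; the caller must not step back further
       than the buffer (or the last snapshot) reaches. */
    static void undo_last_write(void)
    {
        trace_head = (trace_head + TRACE_SIZE - 1) % TRACE_SIZE;
        *trace[trace_head].addr = trace[trace_head].old;
    }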


So maybe we have different definitions of "time travel". But I recall that

- if a compiler finds that condition A would lead to UB, it can assume that A is never true - that fact can "backpropagate" to, for example, eliminate comparisons long before the UB.

Here is an older discussion: https://softwareengineering.stackexchange.com/q/291548

Is that / will that no longer be true for C23? Or does "time-travel" mean something else in this context?


There may be different definitions, but also a lot of incorrect information. Nothing changes with C23 except that we added a note clarifying that UB cannot time-travel.

The semantic model in C only requires that observable effects are preserved. Everything else can be changed by the optimizer as long as it does not change those observable effects (known as the "as if" principle), which is the basis of most optimizations. Thus, I call it time-travel only when it would affect previous observable effects, and this is what is allowed for UB in C++ but not in C. Earlier non-observable effects can be changed in any case, and that is nothing specific to UB. So if you also call time-travel certain optimizations that do not affect earlier observable behavior, then this was and still is allowed.

But the often repeated statement that a compiler can assume that "A is never true" does not follow (or only in a very limited sense) from the definition of UB in ISO C (and never did), so one has to be more careful here. In particular, it is not possible to remove I/O before UB. The following code has to print 0 when called with zero, and a compiler that removed the I/O would not be conforming.

    #include <stdio.h>

    int foo(int x)
    {
      printf("%d\n", x);   /* observable effect: must still happen */
      fflush(stdout);
      return 1 / x;        /* UB when x == 0 */
    }

In the following example

    void bar(int);

    int foo(int x)
    {
      if (x) bar(x);
      return 1 / x;        /* UB when x == 0 */
    }

the compiler could indeed remove the "if", but not because it is allowed to assume that x can never be zero. Rather, 1 / 0 can have arbitrary behavior, so it could also call "bar()"; then bar() is called for zero and non-zero x alike, and the if condition can be removed (not that compilers would actually do this).


I think the clarification is good. The number of optimizations prevented by treating volatile and atomics as UB barriers is probably limited, but as your examples show, a lot of very surprising transformations are still allowed.

Unfortunately I don't think there is a good fix for that.


E.g. this godbolt: https://godbolt.org/z/eMYWzv8P8

There is unconditional use of a pointer b, which is UB if b is null. However, there is an earlier branch that checks if b is null. If we expected the UB to "backpropagate", the compiler would eliminate that branch, but both gcc and clang at O3 keep the branch.

However, both gcc and clang have rearranged the side effects of that branch to become visible at the end of the function. I.e. if b is null, it's as if that initial branch never ran. You could observe the difference if you trapped SIGSEGV. So even though the compiler didn't attempt to "time-travel" the UB, in combination with other allowed optimizations (reordering memory accesses), it ended up with the same effect.
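
For readers who don't follow the link: the pattern being described looks roughly like the hypothetical snippet below (the identifiers are made up and the actual code behind the godbolt link may differ) - an unconditional dereference of b plus an earlier null-check branch whose only effect is a store to a global. Whether and how the compilers reorder that store relative to the dereference is exactly what the link demonstrates.

    #include <stddef.h>

    int g;                       /* side effect guarded by the null check */

    int foo(int *b)
    {
        if (b == NULL)
            g = 1;               /* runs (in the abstract machine) when b is null */
        return *b;               /* unconditional dereference: UB when b is NULL */
    }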

