That's a really profound question that deserves serious research, IMO. Communication is easy enough to define using Shannon's theory of information. I would argue that computation is a subset of communication: signals go in one end of a 'processor' & come out the other (so the processor acts as a 'channel' in communication theory), but the data comes out modified in some non-trivial way, which distinguishes computation/processing of information from mere replication of the signals at the output.

A trivial modification would be something like (X,Y) -> (Y,X), just re-ordering cables. A non-trivial modification would be something like (X,Y) -> (X, X XOR Y). Both of these 'channels' are reversible and have 2 bits of capacity, but intuitively only the second one is actually doing any information processing/computing. (The half-adder (X,Y) -> (X XOR Y, X AND Y) is tempting as the example, but it's lossy -- the inputs (0,1) and (1,0) collide -- so its capacity is only log2(3) ~ 1.58 bits, and capacity alone would distinguish it from the rewiring.)
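This is easy to check numerically (a minimal sketch; for a deterministic map, the mutual information at uniform inputs is just the entropy of the output distribution, which lower-bounds the channel capacity):

```python
from collections import Counter
from math import log2

def mutual_information(mapping, inputs):
    """I(X;Y) for a deterministic map with uniform inputs:
    equals the entropy of the output distribution."""
    outs = Counter(mapping(*x) for x in inputs)
    n = len(inputs)
    return -sum((c / n) * log2(c / n) for c in outs.values())

pairs = [(x, y) for x in (0, 1) for y in (0, 1)]

swap = lambda x, y: (y, x)                 # trivial rewiring
cnot = lambda x, y: (x, x ^ y)             # non-trivial but reversible
half_adder = lambda x, y: (x ^ y, x & y)   # non-trivial and lossy

print(mutual_information(swap, pairs))        # 2.0
print(mutual_information(cnot, pairs))        # 2.0
print(mutual_information(half_adder, pairs))  # 1.5
```

The first two maps are indistinguishable by this measure, which is exactly the problem: the 'amount of processing' isn't captured by information throughput.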

I haven't found a satisfactory way to formalize my intuition about what is 'non-trivial' and to quantify how much 'processing' is being done by a general system, in such a way that one could meaningfully compare a brain and a CPU, for example. I think it's a great research question.

Yes. The most scientific way of talking about "order" in that sense is the Kolmogorov complexity -- which, to be fair, is uncomputable and only defined up to an additive, machine-dependent constant. The best way to put it is that "ordered" states are ones that have low Kolmogorov complexity.
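One crude but computable way to see this: compressed length is an upper bound (up to a constant) on Kolmogorov complexity, so 'ordered' strings compress well. A toy sketch:

```python
import random
import zlib

def compressed_size(data: bytes) -> int:
    """zlib-compressed length: a computable upper bound (up to an
    additive constant) on the Kolmogorov complexity of `data`."""
    return len(zlib.compress(data, 9))

ordered = b'01' * 500                 # highly patterned 1000-byte string
random.seed(0)
disordered = bytes(random.getrandbits(8) for _ in range(1000))

print(compressed_size(ordered))       # small: the pattern compresses away
print(compressed_size(disordered))    # near 1000: essentially incompressible
```

Of course this only gives upper bounds -- a string that looks incompressible to zlib might still have a short description.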

Yeah, you also have to transform the "reference" function, and then the entropy stays the same. I prefer to think of it as the "density of states" -- it's necessary to make the argument of the logarithm dimensionless, after all.

>how physicists get around the discrete and finite restriction

By turning a sum into an integral. The probability 'density' is p(x), and the 'density of states' is n(x); the entropy is then the integral of -p(x) log(p(x)/n(x)) over dx.
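For instance (a quick numerical sketch with a Gaussian p(x) and a uniform density of states n(x) = 1 per unit length; the Riemann sum should land on the closed-form differential entropy, (1/2) log2(2 pi e sigma^2)):

```python
from math import exp, log2, pi, sqrt, e

def entropy_relative(p, n, xs, dx):
    """-integral of p(x) log2(p(x)/n(x)) dx, via a Riemann sum."""
    return -sum(p(x) * log2(p(x) / n(x)) * dx for x in xs)

sigma = 2.0
p = lambda x: exp(-x * x / (2 * sigma**2)) / (sigma * sqrt(2 * pi))
n = lambda x: 1.0  # one 'state' per unit length

dx = 0.01
xs = [i * dx for i in range(-2000, 2001)]  # +/- 10 sigma
h = entropy_relative(p, n, xs, dx)
closed_form = 0.5 * log2(2 * pi * e * sigma**2)
print(h, closed_form)  # both ~ 3.047 bits
```

Changing n(x) (say, to one state per millimeter instead of per meter) shifts the entropy by a constant, which is the usual "entropy is defined up to a reference" story.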

>It seems to say that "given a system we know everything about, there is no way to go to a system that has some unknown things to us".

That's backwards: information is the negative of entropy. The 2nd law says that entropy never decreases, so information never increases (it can only be preserved or lost).

>I don't know what Entropy is supposed to mean on the level of individual states/configurations.

The entropy is a property of a probability distribution, not of a state. Entropy is defined as H = -sum(p_i log(p_i)). A 'state' implicitly defines a probability distribution: uniform probability over all the microstates compatible with the state description.[0] In the case of a microstate, the entropy of that distribution is zero: p_i = 0 for all other states (those terms vanish, since p log(p) -> 0 as p -> 0), and log(p_i) = log(1) = 0 for the one compatible state. In the case of a macrostate, the entropy of the probability distribution over microstates consistent with the macrostate works out to -sum((1/N) log(1/N)) = log(N), where N is the number of consistent microstates. That's the Boltzmann entropy.

Sometimes people will write about the entropy of a 'state' in such a way that it sounds like they're talking about the entropy of a microstate -- but what they're probably talking about is "the entropy of the macrostate that this microstate belongs to." It's sloppy to talk like that, because "the" corresponding macrostate isn't unique: many different macrostates (i.e., sets of microstates) can contain the same microstate, depending on which properties of the microstates one considers 'macro.'

(Ex: 10100101 is a member of both "symmetric bit strings of length 8" and "bit strings of length 8 that average to 1/2". The entropy of "symmetric bit strings of length 8" is 4 bits, whereas the entropy of "bit strings of length 8 that average to 1/2" is ~6.1 bits. And of course, the entropy of "the bit string of length 8 that is exactly 10100101" is zero.)
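Those numbers are easy to check by brute force (a small sketch):

```python
from itertools import product
from math import log2

strings = [''.join(bits) for bits in product('01', repeat=8)]

symmetric = [s for s in strings if s == s[::-1]]       # palindromes
balanced = [s for s in strings if s.count('1') == 4]   # average to 1/2

# Boltzmann entropy: log2(N) for a uniform distribution over N microstates
print(len(symmetric), log2(len(symmetric)))   # 16 microstates -> 4.0 bits
print(len(balanced), log2(len(balanced)))     # C(8,4) = 70 -> ~6.13 bits
print('10100101' in symmetric, '10100101' in balanced)  # True True
```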

There are a broad range of motivations. In my case:

- I wanted to work on nuclear fusion energy, for the future of humanity.

- I was told that if I had a PhD in physics, I could do just about anything - it's a way of keeping options open.

- I liked tinkering in the lab & learning new mind-expanding concepts in textbooks.

- I viewed it as a test of my intelligence. (Turns out, it's more about perseverance.)

I know at least one string theorist, and they seem to be motivated primarily by liking to mess around with abstruse mathematics. Some others seem to enjoy the 'nerd cred.' I think the ones whose sole motivation is 'getting to the bottom of the universe' probably burn out early, b/c there's so little of that to be had right now. (I've heard from several people who got their PhDs in particle physics who went on to do data science/programming, saying that the field is depressing and that's why they didn't pursue it further.) As for me, I found the 'shut-up-and-calculate' attitude a major turn-off to studying quantum physics. (Plasma physics uses little or no QM, so that worked out for me.)

Fusion researcher here. Yes, this piece is sensationalized, as are most pieces about fusion, and about science/technology in general, unfortunately. It's depressing that tokamak performance hasn't improved much in all that time. The only two ways to scale up the gain are to build the device larger (like ITER) or with higher magnetic field (like SPARC). The existing devices have been around for many years and have been pushed basically as far as they can go; the only way forward is to build new devices. There's really no reason to expect a breakthrough on tokamak performance that will suddenly change the game. The only really positive news for tokamaks in the last ~30 years is the advent of REBCO superconductors & the success Commonwealth Fusion has had so far in capitalizing on this technology.

CFS claims that it's manageable, although I don't know enough to evaluate that claim. It's also not clear from this quote if the once/twice per year is referring to full replacement of the vacuum vessel, or maybe just to inspection or replacement of a subset of components.

"Bob Mumgaard, CEO of Commonwealth Fusion Systems, regards neutron flux as part of a fusion power plant’s wear-and-tear—a routine aspect of its servicing cycle, happening perhaps once or twice a year. “We can simplify the internal components, develop maintenance scenarios,” he says. “We have such a scheme substantially in place.”"
https://nautil.us/issue/86/energy/the-road-less-traveled-to-...

Interesting. I wonder if personification is analogous to a "memory palace": it's a way to hijack our innate brain capacity for working on something more abstract. I've always felt like physics derivations were like mystery stories - the best ones had a surprise twist of reasoning that leads the detective to the solution from the scattered clues.

I like having static analysis for undefined names in my IDE for exactly this reason: I can write the program in the natural order (high level to low level), and then have a little reminder about what I haven't finished yet.

- Information-theoretic approaches to AGI (ex: total correlation explanation, free energy methods)

- Causal/generative/physics-based approaches to AGI

- Renewable energy & clean tech: battery systems modeling, demand/weather forecasting, or hardware projects

- Off-the-wall physics ideas: I'd be willing to listen to them & provide feedback. I'm personally interested in trying to reformulate quantum mechanics starting from geometric/Clifford algebra and Bayesian probability theory, as a way to make it more intuitive.

What's your take on the idea of on-orbit propellant gathering?[0] That is, the concept of a solar-electric propulsion sat in a low orbit (80-120 km), scooping up atmosphere, ejecting the nitrogen to compensate for drag while storing the oxygen for a propellant depot. I tried to run some numbers on it, and it seems like it might work, but maybe doesn't pay for itself fast enough to be worthwhile.
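The kind of numbers I mean (a rough back-of-envelope sketch; the orbital speed, the ejectable N2 mass fraction, and the drag model are all crude assumptions): the ejected nitrogen has to supply at least the momentum flux of the intercepted air, which puts a floor on the exhaust velocity.

```python
# Break-even condition for an air-scooping electric-propulsion sat.
# Assumptions (all rough): circular orbit near 100 km, nitrogen is the
# only ejected species, drag ~ momentum flux of the intercepted air.
v_orbit = 7.84e3   # orbital speed near 100 km altitude, m/s
f_n2 = 0.75        # assumed ejectable N2 mass fraction at that altitude
g0 = 9.81          # standard gravity, m/s^2

# Drag on the scoop ~ mdot_total * v_orbit (intercepted momentum flux).
# Thrust from ejecting the N2 fraction is f_n2 * mdot_total * v_exhaust.
# Break-even requires f_n2 * v_exhaust >= v_orbit:
v_exhaust_min = v_orbit / f_n2
isp_min = v_exhaust_min / g0
print(f"minimum exhaust velocity: {v_exhaust_min:.0f} m/s")
print(f"minimum Isp: {isp_min:.0f} s")  # ~1066 s, within electric-thruster range
```

So the physics doesn't seem to rule it out -- the required Isp is achievable with electric propulsion -- which is why the economics (power, scoop size, time to fill the depot) seemed like the real question.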