> Galactic spectra, which JWST started to send back in earnest at the end of last year, are useful for two reasons.
> First, they let astronomers nail down the galaxy’s age. The infrared light JWST collects is reddened, or redshifted, meaning that as it traverses the cosmos, its wavelengths are stretched by the expansion of space. The extent of that redshift lets astronomers determine a galaxy’s distance, and therefore when it originally emitted its light.
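The redshift-to-age step in the quoted passage can be sketched numerically. A minimal pure-Python example; H0, the density parameters, and the H-alpha wavelengths are illustrative round numbers under an assumed flat ΛCDM cosmology, not the values or pipeline astronomers actually fit:

```python
import math

# Assumed round-number cosmology (flat Lambda-CDM), for illustration only
H0 = 70.0          # Hubble constant, km/s/Mpc
OMEGA_M = 0.3      # matter density parameter
OMEGA_L = 0.7      # dark-energy density parameter
H0_PER_GYR = H0 * 3.156e16 / 3.086e19  # H0 converted from km/s/Mpc to 1/Gyr

def redshift(lam_obs, lam_rest):
    """Redshift z from observed vs. rest-frame wavelength (same units)."""
    return lam_obs / lam_rest - 1.0

def lookback_time_gyr(z, steps=100_000):
    """Lookback time t = integral_0^z dz' / ((1+z') H(z')), by midpoint rule."""
    total = 0.0
    dz = z / steps
    for i in range(steps):
        zp = (i + 0.5) * dz
        hz = H0_PER_GYR * math.sqrt(OMEGA_M * (1 + zp) ** 3 + OMEGA_L)
        total += dz / ((1 + zp) * hz)
    return total

# H-alpha (rest 6563 Angstroms) observed at ~4.86 microns (48566 Angstroms)
z = redshift(lam_obs=48566.0, lam_rest=6563.0)
print(f"z = {z:.2f}, lookback time ~ {lookback_time_gyr(z):.1f} Gyr")
```

With these numbers the line comes out at z ≈ 6.4, i.e. light emitted more than 12 billion years ago.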
Won't a photon climbing out of a huge gravity well have a huge redshift, thus confounding estimates of distance from us and estimated age?
The photons that are getting observed are emitted from the accretion disk, which is not very close to the event horizon.
You can estimate how close to the event horizon the disk is based on how broad the spectral lines are. The part of the disk that is coming towards you will be blueshifted and the part of the disk that is rotating away will be redshifted. From that (which is independent of the overall depth of the potential well) you can figure out how fast the disk is rotating. And from that you can figure out how far away the disk is from the black hole.
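A back-of-the-envelope version of this argument, with made-up numbers (a 50 Å half-width on H-alpha; nothing here is from a real spectrum): the line broadening gives the orbital speed, and for a circular Keplerian orbit the radius in Schwarzschild units follows from the speed alone, independent of the black hole's mass:

```python
C = 299_792.458  # speed of light, km/s

def orbital_speed_from_broadening(delta_lambda, lam_rest):
    """Half-width of the line, read as the projected rotation speed (km/s)."""
    return C * delta_lambda / lam_rest

def radius_in_schwarzschild_units(v_kms):
    """Circular Keplerian orbit: v^2 = GM/r, and r_s = 2GM/c^2,
    so r/r_s = c^2 / (2 v^2) -- the black hole mass cancels out."""
    return C**2 / (2 * v_kms**2)

# Illustrative: a 50 Angstrom Doppler half-width on H-alpha (6563 Angstroms)
v = orbital_speed_from_broadening(delta_lambda=50.0, lam_rest=6563.0)
r = radius_in_schwarzschild_units(v)
print(f"v ~ {v:.0f} km/s  ->  r ~ {r:.0f} Schwarzschild radii")
```

For these numbers the disk sits thousands of Schwarzschild radii out, which is why the gravitational redshift contribution is negligible there.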
Sometimes I despair for humanity [see advertising/social media/politics], but then I see something like this and think, we're doing pretty well for self-taught primates.
It’s pretty fascinating (and scary) to think of the disparity between the brightest among us, building in turn on the shoulders of giants, and groups of outcasts who band together for the stupidest reasons. No wonder we are facing challenges.
And then we see all of the belief systems that mock, punish, or worse the people who are trying to expand human knowledge. Looking back at history, you have to wonder where we might be now if there hadn't been these attempts at mass eradication of learning. There are times I wonder if we're heading in that direction again; this time, rather than burning the Library of Alexandria or locking everything in catacombs, we're trying to get there on a low-and-slow approach.
The STEM people will just need flashy robes and long grey hair, and to market themselves as the magicians who can read the runes, command the machine spirits of yore, and make the food grow. It's good fun to pester the nerds until the wifi is down.
Fantasy lit is full of stories about the good people rebelling against the evil wizards and their machinations.
Luke fighting Emperor Palpatine? What does Luke or any of the Rebels know about managing an org of Peta-scale population? Or what myriad threats to the galaxy the Emperor was handling with his century of experience, network of millions of loyal agents, and vast magical ability? I'm sure that will all be fine.
The emperor wasn't doing such a great job holding the galaxy together as a political unit, there was a successful overthrow of his rule after just a couple decades in power.
Reminds me of the book "The Last Ringbearer". It tells the story of "The Lord of the Rings" from the perspective of Sauron, Saruman, and the Orcs. Turns out, Mordor was a nation in its early industrial stage, of people putting science before magic. Fanatics opposed to science committed genocide against them while painting them as monsters.
I mean, it really depends on your personal experience how your perspective of "where we are now" feels. It's not like everyone is even remotely concerned with, or benefiting from, this "advancement."
A lot of the so-called progress has done nothing but destroy and distort an enjoyable and peaceful existence.
I guess we might be better off if Pol Pot had killed more smart people than he did. Maybe Trump will win and finish off the rest of us and you can be happy.
> Won't a photon climbing out of a huge gravity well have a huge redshift
It depends; the key factor is not how "huge" the gravity well is (in terms of how massive the object is), but how close to the black hole horizon the light is emitted. The vast majority of the light JWST is seeing from black holes is not from very close to the horizon. It's from the accretion disk, which is much further from the horizon and so the gravitational redshift is much smaller.
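To put numbers on "much smaller": for a Schwarzschild black hole the gravitational redshift of a photon emitted at radius r is 1 + z = (1 - r_s/r)^(-1/2), which dies off quickly with distance. A quick sketch:

```python
import math

def grav_redshift(r_over_rs):
    """Gravitational redshift for emission at radius r (in units of the
    Schwarzschild radius r_s): 1 + z = 1 / sqrt(1 - r_s/r)."""
    return 1.0 / math.sqrt(1.0 - 1.0 / r_over_rs) - 1.0

# From just outside the horizon out to typical accretion-disk distances
for r in (1.5, 3, 10, 100, 1000):
    print(f"r = {r:>6} r_s  ->  z_grav = {grav_redshift(r):.5f}")
```

At a few Schwarzschild radii the effect is of order tens of percent, but by a thousand radii it is down to parts in ten thousand, which is negligible next to cosmological redshifts of 6 or more.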
> Won't a photon climbing out of a huge gravity well have a huge redshift,
Yes
> thus confounding estimates of distance from us and estimated age?
Not if you know, or can get a good estimate of, the potential well it's climbing out of.
That said, Brian Cox does sometimes joke that astronomers round π to 1; while I wouldn't know about the reality, it's probably safe to infer at least some frustration on his part about the precision of things in this field.
The joke was way more true in the 20th century. There were many really important measurements where the unprecedented "precision" was only enough to say it's X, 100X, or something in between.
Nowadays astronomy has gotten a lot more precise. But there are disagreements about how much confidence to put in that extra precision.
sometimes, i wonder if astronomy/physics were to only use unsigned numbers, if things would just make more sense. you'd get much more precision, and then you wouldn't have to worry about "but the math says it's possible" issues by taking everything by * -1.
Well, Mr White, if you look at all of the "weirdness" in physics with theories coming out because "the math says it's possible", then you'd see that it would be much more simple if we were not allowed to have negative numbers because we're only using u64 integers. we'd have more digits for precision as well.
I find that weird outliers are usually evidence that the underlying model is flawed in some fundamental way, but sometimes I think people believe the exceptions are fundamental, and a lot of effort goes to the wrong place.
But I mean, space is really, really big and we are observing from one single spot with (on a cosmological scale, probably) primitive technology. So of course most of it is guessing, and when you "guess" a lot of things, it maybe does not matter much whether you use 3.14 for π or 1, when the data you have are rough estimates anyway. But sure, doing sloppy math when you could have precision - that would be just wrong and unscientific.
Then you're so close to 10 that you can just use 10.
I actually did that on a physics exam once. Somehow I had a value with pi squared. I just replaced it with 10. I don't remember if I got that question right or wrong - I'm hard pressed to think of a situation where pi squared actually legitimately shows up.
> Won't a photon climbing out of a huge gravity well have a huge redshift, thus confounding estimates of distance from us and estimated age?
Of course it will. It will also have a redshift related to the expansion of the space it's been travelling through. Both of those corrections are absolutely part of the model. It's not nearly as simple as "Astronomers forgot about General Relativity!".
But I guess it's true (to be clear: I'm just an amateur in this field) that the ΛCDM model for cosmological evolution that we've all been looking at for the past decade or two isn't holding up well at all. It looked like it was pretty much there and just needed some fine tuning. Then we got a bunch of new data and everything's a mess.
That's kind of exciting all by itself, though it's also leading to a bunch of nattering from the existing iconoclasts (MOND nuts in particular) whose theories are also not working very well to explain JWST observations.
New insight needed, basically. We're all watching for updates.
> Both of those corrections are absolutely part of the model
Is there a way to gain a better understanding of how these parameters are modeled and what the scientific evidence is for the various phenomena in astrophysics? It's somewhat perplexing to me as an outsider of the field to understand how things like mass and distance of stars and planetary bodies are determined when 1) the scales are so outside of conventional experience 2) observation is limited to 2D imaging of the night sky 3) the observations in general are not consistent with our knowledge of gravity and relativity without adding hidden parameters.
I know Terence Tao (yes, that Terence Tao) is working on a book about the cosmic distance ladder, but sadly it's not out yet. (I guess he probably has other projects he's working on.) But he does have some slides from talks he's given: https://terrytao.files.wordpress.com/2020/10/cosmic-distance...
> Won't a photon climbing out of a huge gravity well have a huge redshift
Yes. Though see the sibling comments on why the (remaining) gravity well the emitted photon has to climb out of is not particularly huge here.
> thus confounding estimates of distance from us and estimated age?
Generally speaking: No, because people are not stupid. :) See the integrated Sachs-Wolfe effect for a very similar situation (CMB photons traveling through gravitational wells) -> https://en.m.wikipedia.org/wiki/Sachs%E2%80%93Wolfe_effect
In quantum mechanics, a single particle is also a wave and vice versa. Light is in fact the thing where this was first discovered - it had been proven to be a wave for at least a few decades when Einstein discovered the quantum nature of the photoelectric effect, proving it is also a particle. This discovery was the very start of quantum mechanics, in fact.
"in fact", that was not the very start of quantum mechanics. Einstein's work was preceded by 5-10 years by investigation into the cause of quantized black body radiation. that's why we have Planck's constant and the notion of quantized oscillators. why are you going around talking with such authority? QM and nature is more amazing than you realize. particles are not waves "and vice versa". particles don't exist nor do any classical waves of energy in the way you keep saying as if they're facts. only the wavefunction travels. please be more careful about posting so authoritatively. comments such as yours are why it takes people so long to stumble onto the reality of nature, if they ever are so lucky.
Your comment is a bit strange. Light doesn't travel as photons. Photons exist exclusively at the site and instant of detection; what light really travels as is a wave of probability of detection.
When light is redshifted, its wavelength becomes longer, and therefore it carries less energy.
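A tiny numeric check of this, using the H-alpha line as an example: photon energy is E = hc/λ, so stretching the wavelength by (1 + z) divides the energy by (1 + z).

```python
H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
EV = 1.602176634e-19 # joules per electronvolt

def photon_energy_ev(lambda_m):
    """Photon energy in eV for wavelength in meters: E = h*c/lambda."""
    return H * C / lambda_m / EV

e_rest = photon_energy_ev(656.3e-9)              # H-alpha at rest, ~1.89 eV
e_obs = photon_energy_ev(656.3e-9 * (1 + 6.4))   # same line redshifted to z = 6.4
print(e_rest, e_obs, e_rest / e_obs)             # the ratio is exactly 1 + z
```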
That is not the current understanding of quantum mechanics, as far as I know. Wave-particle duality says that different experiments can view light as either a wave or a particle (never both), and that speaking about the nature of light when an experiment is not being performed is non-scientific by definition.
Importantly, light very much behaves like a conventional wave in many real experiments - the interferometer experiment being one of the oldest and best known. It is not a probability wave in that case, but an actual physical wave (now known to be an oscillation in the electromagnetic field, but long assumed to be a mechanical wave in the luminiferous aether).
Experimentally, one never observes waves. Light is detected based on its interaction with electrons, and that is always by an electron absorbing a quantum of energy, not via some continuous process as would be the case with waves.
Classically one can imagine that as if the electron were hit by a particle. But then we have light diffraction and interference, which classically are described as waves. So from a classical point of view, light travels as a wave but interacts as a particle.
As for the nature of light, consider that there is a reformulation of classical electrodynamics that eliminates electromagnetic waves altogether. There are only electrons that interact with each other directly, with no waves in between. Feynman spent quite some time trying to develop quantum electrodynamics based on that. He failed. Still, the point stands that we never observe light directly, but only through its effects on electrons and other charged particles. So it could be that what we call light is a theoretical artifact and there is no light in reality.
> So from a classical point of view, light travels as a wave but interacts as a particle.
And the classical point of view is wrong. Photons resemble classical particles in a few respects, and classical waves in a few others, but at the end of the day they're neither.
> Still the point stands that we never observe light directly but only through its effects on electrons and other charged particles.
This is true of literally everything. "Direct" observation does not exist. Every atom, every cell, every person, every planet, every star - you know them by their effect on your sense-data, or else not at all.
photons dont exist, dude, except in connection with and at the site of the detector. study some qft and then you can go talk about it on the internet with authority.
and yes direct observation exists. that's what measurement is. and that's all you can ever "observe" unless you incorporate the wavefunction, which also doesn't "exist".
I have. Srednicki and Weinberg are sitting on my bookshelf right now.
> photons dont exist, dude, except in connection with and at the site of the detector.
Exactly backwards.
Photons fall out of mode-expanding asymptotic EM field states just like any other particle. It's the interaction picture that can't be rigorously built up out of particle states.
good to know you're informed, but it doesn't mean we're communicating effectively.
what i said is not backwards unless you use the inverse understanding of "exist". and how can photons exist without interaction? they don't. they're localizations by detectors. until then, they only "exist" as the probability wave. that's due to nothing localizing them. that's how i mean "exist". so it's not really meaningful to use that word.
btw "any other particle" doesn't fall out of EM fields.
> btw "any other particle" doesn't fall out of EM fields.
I very obviously meant that XYZ-particle states fall out of mode expansions of the XYZ-field, not that all particles are EM quanta.
> what i said is not backwards unless you use the inverse understanding of "exist". and how can photons exist without interaction?
Exactly the same way any other field configuration can? The state space of the free EM field simply is the photon Fock space.
The state space of an interacting 4D EM field is unknown and may well not exist, hence the need for perturbative approaches and renormalization - in which particle states again emerge as terms in the perturbation series.
I'm pleased to talk with you given that you're familiar with the topic.
> The state space of the free EM field simply is the photon Fock space.
but I'm not sure what your point in mentioning this is. When people hear things like "particles exist between their emission and measurement", they might get the idea that there could be said to be any reality to the existence of particles in transit. They simply aren't particles the way people think of them - they are probability waves. A great example we could discuss is the HBT experiment in which photon bunching was seen. If the bunched photons were really "particles" by any sense of the word, they wouldn't bunch in time of detection merely by the fact they are entangled with each other. That would be like saying entanglement affects the flight of a particle which is simply not able to be said.
> They simply aren't particles the way people think of them
They're not anything the way laymen think of them. They're not classical particles, they're not classical waves, they're not classical anything. They're not really probability waves either, for that matter, or even field configurations - the real object is the algebra of observables. But laymen don't get to dictate the vocabulary of physics, and neither do mathematicians. Fields are the things that would be dual to the observable algebra, were it always equipped with a dual; particles are the things whose creation and annihilation operators would generate it, were there only always such things.
Well, you are still talking about models. The creation and annihilation operators are also mathematical structures. That does not mean, as you seem to imply, that "particle" physics cannot be conceptualized outside of QFT. QFT may be superseded someday and the physical systems we try to describe will still be exactly as they are. I'm not sure if you'll agree with me since I suspect you've got a reason not to.
so it's the tone of my textual words? i think you're mistaking my laughter at their blatant and willful ignorance for hostility, and quite honestly it sounds like you're projecting that hostility. from the start I've been incorrectly downvoted and critiqued while being the primary commenter in this child thread providing a semblance of correct view. to be quite frank i deserve an apology, not some nitpick that comes from your personal assumption about my tone. but i know i won't get one because it's not hostility you care about at all. a lot of you commenters here are pretty hilarious. but not in a good way.
At this point anyone who is as certain about the nature of light as you seem to be may well be right, but is demonstrating a level of confidence the literature does not yet seem to support.
Although there are those who believe that photons only exist at source and at the detector, there is some experimental evidence which contradicts that (for example, this 2013 paper in which researchers 'read' information from a photon without destroying it, which implies its continued existence between creation and observation: https://doi.org/10.1126/science.1246164).
For what it's worth, I am also of the opinion that your tone was hostile, and yes, 'textual words' can and do have a (metaphorical) tone. Regarding downvoting and critiquing, there's an old adage about if you walk into a room and it smells of dog shit, maybe someone in the room stepped in something on the way in, but if every room you walk into all day stinks, well maybe you should check your own shoes.
so, you're trying to tell me that such reasoning about "dog shit" applies to the ignorance of humanity in general? The reality is the very few people actually understood what they were talking about and a lot of people went around operating on what they didn't really know. So while you call me an asshole for correcting a number of incorrect people in this thread who are themselves rather overconfident, I have to again insist that because I am correcting them or destroying their delusions does not make me hostile and there's a very good chance that you're projecting that impression. The definition of hostility includes the intent to do harm or to be unfriendly, which is in fact what the people I am replying to are doing by spreading misinformation, as if they know. If that's too difficult for you to accept, then best that you do not reply to me, because it would be a waste in both of our time.
"Falsehood travels a great distance in one night before the truth even crosses the threshold of the door." And the masses sentenced Socrates to death just because he called them hypocrites.
It's a good thing I know these truths or I would have been damaged by you.
It still makes me sad though.
and for what it's worth I don't believe the paper you linked says what you think it says.
you are wrong, and you don't "know". the waves interfere with themselves unless they are measured. then the plate measures interfered waves in the form of singular photons, not waves. look it up.
This is super cool. I think it's funny how Quanta writes, "It is expected because JWST was built, in part, to find the ancient objects." "Ancient" is equivalent to "distant" because light travels at a finite speed. It's an odd thing, astronomy: a real-world time-traveling observation, except we're limited in resolution and we can't look here, only there. If it was a game mechanic I'd call it a cunning way to impose internally consistent limits on what the player can know.
> If it was a game mechanic I'd call it a cunning way to impose internally consistent limits on what the player can know.
But if you assume it's a multi-player game, wouldn't it be easier to implement a global space-time? The reality is that you have to keep an internal clock for each player.
In practice you need a global coordinate system. For example, you'd create a "spawn point" for all players, and then, before the game starts, move them to their starting positions at a fraction of c. The game still keeps track of units within this initial coordinate system, and transforms those coordinates for the player. The reverse is done for player decisions. There is effectively an in-game "star network" that mediates between players. Each player/unit has a unique light cone associated with it.
Nothing stops us from arbitrarily designating a particular reference frame as "preferred" in a simulation; SR merely says that such a choice is truly arbitrary. Note also that this system could work recursively, such that new units must always spawn at the spacetime location of older units, and do information management (behind the scenes) in a similar way. This would turn the network into a tree - and the transforms are relatively cheap (and we could use a Spacetime algebra library!)
Note that I don't think this works for a traditional RTS like C&C or Supreme Commander. Rather, I'm thinking of a game like Civilization, but galaxy-spanning (and spanning very long periods of time), where the smallest objects are something like "suns" and "fleets". The individual lives of humans would come and go as these vast battles took place over thousands of years. You'd only need to add magic like a 1g constant-acceleration drive - and a strong rule against using it to directly kill planets - and assume that the fleets are actually large-ish asteroids fitted with the drives and a human and/or robotic crew.
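The per-player coordinate transform in the scheme above is just a Lorentz boost. A minimal 1D sketch (c = 1 game unit; the scenario numbers are invented for illustration):

```python
import math

def boost(t, x, v):
    """1D Lorentz boost of event (t, x) from the global ("server") frame
    into the frame of a player moving at speed v (fraction of c, c = 1)."""
    gamma = 1.0 / math.sqrt(1.0 - v * v)
    return gamma * (t - v * x), gamma * (x - v * t)

# An event 10 light-units away and 10 time-units from now, as seen by a
# player cruising toward it at 0.6c:
t_p, x_p = boost(10.0, 10.0, 0.6)
print(t_p, x_p)  # t' and x' stay equal: the event is on the light cone in both frames
```

Note how the boost preserves the light cone (t = x maps to t' = x'), which is exactly the invariant the game would need to respect.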
If it’s a simulation then there is a fixed point. Any proper game developer would also fake it for the most part e.g. culling things the players aren’t looking at, making distant objects static billboards and they’d also fudge the numbers with something like the cosmological constant.
Not the original author but my first interpretation is that this means, if it were in a video game, it would be a way to self-reinforce "fog of war" where you can only have extremely local information and you don't want extra information to leak to the players.
Wouldn't that be incredibly hard, considering that the player's perspective is disconnected from the units?
I observe my base and order a group of units to go to some place on the minimap. I see them moving along on their way, and decide to scroll to them. Only to realize that they already arrived and got massacred by the enemy. I go back to my base only to realize that it was destroyed ages ago (in its frame of reference) and I'm dead. Game over. Thanks for playing SR Red Alert.
In ordinary strategy games you already have an assumption of instantaneous information transfer from unit to player: you see what each unit is seeing. This could be kept, so you would see your units moving in real time, but the further things are from your units on the map, the older the info you get about them.
That's interesting. Wouldn't it lead to paradoxes where enemy units are shown in multiple places at once, observed from different frames of reference? It would be incredibly confusing, but confusing games can be fun too.
EDIT: Also, if a unit traveled close to c away from a friendly unit and then back, whose frame of reference is real?
Merging observations is an interesting question. The simplest thing to do may be to show an enemy unit as it is seen by the friendly unit closest to it (here we just use "real" distance in the game's internal coordinates). In other words, a Voronoi diagram is built with our units as seeds, and enemies are shown as seen from their cell's seed. This would avoid enemy "phantoms", but enemies could suddenly "teleport" as they move closer to another friendly unit (or can't they? I don't really know).
In lore it may be explained like "our units know distances to objects and choose closest one among all units" since we already use "units can communicate instantly" magic.
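A toy version of this "nearest friendly unit observes" rule, assuming straight-line enemy tracks and the instant communication between friendly units posited above (names and numbers are all invented):

```python
import math

def dist(a, b):
    """Euclidean distance between two (x, y) points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def rendered_position(enemy_track, friendlies, now, c=1.0):
    """enemy_track: function t -> (x, y). Render the enemy where the nearest
    friendly unit would currently see it, i.e. delayed by light travel time.
    (One step of delay only; the exact answer is a fixed-point problem.)"""
    current = enemy_track(now)
    observer = min(friendlies, key=lambda f: dist(f, current))
    delay = dist(observer, current) / c
    return enemy_track(now - delay)

# Enemy moving right at half lightspeed; friendly units at two bases.
track = lambda t: (0.5 * t, 0.0)
friendlies = [(0.0, 0.0), (100.0, 0.0)]
print(rendered_position(track, friendlies, now=40.0))
```

At t = 40 the enemy is really at (20, 0), but the nearest base at the origin sees light that left 20 time-units ago, so it is rendered at (10, 0).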
You’d have to have a HQ relative to which all dilation is calculated. If done correctly, it could simplify server operations since the need for simultaneity is relaxed.
The density of a black hole decreases as the inverse square of its mass. That means massive black holes have a much lower density than small black holes. So they are more likely to form than small black holes. Dark matter will have played an important role in the creation of those early black holes. If there is no dark matter and some form of MOND theory of gravity is correct, the Schwarzschild formula will require a modification for large black holes. In that case galaxy centers will not require large masses to produce the same effects.
> That means massive black holes have a much lower density than small black holes. So they are more likely to form than small black holes.
No, it doesn't, because you're ignoring all the physics required to get the mass into a small enough region so that it will collapse to form a black hole. The only way to make that happen that we know works is to form massive stars, where a fraction of the star's mass, in the center of the star, will eventually form a black hole. But the largest stars we know of have masses of ~ 200 times the Sun, and so can't form black holes more than a fraction of that.
If you imagine a more massive gas cloud collapsing under its own gravity, it will fragment into subclumps before getting very dense; these subclumps will themselves fragment or go on to form stars directly, but with an upper limit of, say, ~ 200 solar masses.
(It's possible that if you start with a cloud of pristine gas in the early universe -- nothing but hydrogen and helium -- that it might collapse to form a single supermassive star, or even a black hole directly. That might give you something like a 1000-solar-mass black hole. But that's still fairly speculative, and requires unusual conditions that don't exist generally.)
I think the claim is like this: You have the primordial universe, shortly after the big bang, with fluctuations in density. But the whole density is very high. A fluctuation over a large region could put the region over the threshold to become a black hole, because the density required for that to happen is lower than for a small region.
Mind you, I don't know if this actually works. What was the density of the early universe, compared to the density required to form a black hole? How large were the fluctuations? Is this scenario plausible at all?
I suppose that if you go back close enough to the big bang, then you can get a density high enough. But then, if you go back not much farther, shouldn't the whole universe have formed a black hole? And if it didn't, can we trust the logic that says that the situation should have led to the formation of giant black holes?
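One way to make the threshold quantitative (a naive sketch that ignores expansion, pressure, and all actual early-universe physics): a uniform sphere of radius R sits inside its own Schwarzschild radius when ρ > 3c²/(8πGR²), so the required density falls as the square of the region's size:

```python
import math

G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8     # speed of light, m/s
LIGHT_YEAR = 9.461e15  # m

def collapse_density(radius_m):
    """Density above which a uniform sphere of this radius lies inside its
    own Schwarzschild radius: rho > 3 c^2 / (8 pi G R^2)."""
    return 3 * C**2 / (8 * math.pi * G * radius_m**2)

for r_ly in (1, 1e3, 1e6):
    rho = collapse_density(r_ly * LIGHT_YEAR)
    print(f"R = {r_ly:g} ly: collapse requires rho > {rho:.3e} kg/m^3")
```

Even a one-light-year region only needs about a millionth of a gram per cubic meter, which is why the "shouldn't the whole universe have collapsed?" question is a fair one and the real answer has to involve expansion dynamics.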
I suppose OP defines it as the mass of the BH, divided by the apparent volume taken up by the BH (more precisely: the apparent horizon), as seen from the outside. Put differently, for a Schwarzschild BH: Density ~ M/R³ (modulo constant prefactors) ~ 1/M², since the Schwarzschild radius is linear in M.
The event horizon is a three-dimensional null hypersurface, though, encompassing a four-dimensional spacetime "volume". You are probably referring to the two-dimensional apparent horizon, which depends on the spatial slicing.
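Whatever one calls the "volume", the ρ ~ 1/M² scaling from M/R³ with R ∝ M is easy to check numerically:

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg

def schwarzschild_density(mass_kg):
    """Mean density inside the Schwarzschild radius r_s = 2GM/c^2."""
    r = 2 * G * mass_kg / C**2
    return mass_kg / ((4.0 / 3.0) * math.pi * r**3)

for m in (1, 1e6, 1e9):   # solar masses
    print(f"{m:>6g} M_sun: {schwarzschild_density(m * M_SUN):.3e} kg/m^3")
```

A stellar-mass black hole comes out around 10^19 kg/m³, while a 10^9-solar-mass one has a mean "density" of a few tens of kg/m³, i.e. less than water.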
Maybe I’m a bit naive here, but I thought finding many supermassive black holes in the very early universe should be expected, not a surprise. My chain of thought is as follows:
- Black holes are collapsed masses. In our phase of the universe, they are normally the result of supernovae, but only because today stars are the only source of sufficiently dense masses.
- Because the total mass of the universe is constant and the early universe had a volume dozens of orders of magnitude smaller than the universe at later stages (the universe we normally observe), collapsed masses should have formed much more readily than at later stages.
- Those masses do not necessarily come from stars, which are masses with visible fusion processes, but could come directly from mass-lumps left over after the Big Bang.
- In fact, I think the idea of a super-homogeneous universe right after the Big Bang is suspect. Even the CMB is only homogeneous on (very) large scales (3000 megaparsecs), way too big for black holes.
- Therefore, I think that after the Big Bang, there should have been a large number of primordial black holes.
- We can only observe black holes with an accretion disk, but primordial black holes with no visible accretion will stay invisible unless there is a collision/merger in progress.
Consequently, we will find a lot more black holes in the early universe the more we look. They are supermassive because of a bias: only supermassive black holes with accretion disks were producing radiation powerful enough for us to detect with our current technology.
What do you mean by "many" black holes being "expected"? What's "many"?
You have to be specific, and to do that you model, and then validate through observation. The article states that more black holes are being observed than the models predict, which is their definition of "many".
> Because the total mass of the universe is constant
It isn't [1]:
> The point is pretty simple: back when you thought energy was conserved, there was a reason why you thought that, namely time-translation invariance. A fancy way of saying “the background on which particles and forces evolve, as well as the dynamical rules governing their motions, are fixed, not changing with time.” But in general relativity that’s simply no longer true. Einstein tells us that space and time are dynamical, and in particular that they can evolve with time. When the space through which particles move is changing, the total energy of those particles is not conserved.
> It’s not that all hell has broken loose; it’s just that we’re considering a more general context than was necessary under Newtonian rules. There is still a single important equation, which is indeed often called “energy-momentum conservation.” It looks like this:
> ∇_μ T^{μν} = 0
> The details aren’t important, but the meaning of this equation is straightforward enough: energy and momentum evolve in a precisely specified way in response to the behavior of spacetime around them. If that spacetime is standing completely still, the total energy is constant; if it’s evolving, the energy changes in a completely unambiguous way.
> In the case of dark energy, that evolution is pretty simple: the density of vacuum energy in empty space is absolutely constant, even as the volume of a region of space (comoving along with galaxies and other particles) grows as the universe expands. So the total energy, density times volume, goes up.
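The quoted bookkeeping can be illustrated with a trivial sketch: track the energy in a comoving box as the scale factor a grows (units are arbitrary; the three scaling laws are the standard ones for vacuum energy, matter, and radiation):

```python
def energies(a, rho_vac=1.0, e_matter=1.0, e_photon_at_a1=1.0):
    """Total energy of each component in a comoving box at scale factor a.
    Vacuum density is fixed, so its total grows with volume (a^3); the
    matter total stays flat; the photon total falls as 1/a because each
    photon is individually redshifted."""
    volume = a ** 3
    return {
        "vacuum": rho_vac * volume,     # grows as a^3
        "matter": e_matter,             # constant
        "photons": e_photon_at_a1 / a,  # redshifts away
    }

print(energies(1.0))
print(energies(2.0))  # vacuum energy x8, matter unchanged, photons halved
```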
There remains doubt about CMB inhomogeneity at the smallest scales - basically the error bars at the right of this graph, and the region off the chart to the right, which is yet to be measured:
A naive view would be that a turbulent Big Bang has turbulence all the way down, with many fine-grain clumps and vortices, giving rise to early black holes.
Inflation is invoked to create a homogeneous CMB, but that begs the question, and cannot be used to rule out small-scale inhomogeneities.
This makes a lot of sense. In those first few seconds, that incredibly hot, dense soup of matter would've immediately "pitted" through with absolutely enormous black holes, and like rain, those black holes were probably the "seeding" nuclei for the remaining matter to clump around and form galaxies. Which suggests that galactic core supermassives are drifting leftovers from the formation of the universe. It might also suggest that after the primordial age, they may not have consumed much matter at all (though we have plenty of evidence of some early supermassives consuming quite a lot at the fringes of the observable universe).
I still don't get it. If during the first few seconds there was so much matter in so little space that enormous black holes should have formed, then why did the Big Bang even happen? At the very first instant, all the matter was in the tiniest of spaces; everything should have immediately formed a black hole, and the Big Bang should never have happened.
Disclaimer: I am a lay man and a physics enthusiast, no more. My thought is that the initial inflationary event of the Big Bang was far more powerful than early gravity’s ability to counteract it, maybe even on the order of the nuclear forces. I’d speculate that the “pits” (“low pressure systems” if we analogize to storms) would have formed in this period. Once expansion slowed, maybe even just a few seconds in, the conditions might have been conducive for the pits to collapse into black holes.
In fact, primordial black holes are a candidate in the dark matter hypothesis. They would still be invisible, but at least we could devise a way to detect them (light bending/gravitational lensing, for example).
It's pretty cool how much JWST has caught in such a short amount of time. Give it another 5-10 years and we may have some absolutely groundbreaking new celestial theories.
It's really worthwhile to compare it to Hubble in that regard.
Pre-Hubble, the question was whether the universe would contract into a big crunch, expand slower and slower forever, or balance out somehow in the middle. Then Hubble got a good look at universal expansion, we noticed it was speeding up, and now we've got Lambda-CDM from that.
From this we know the long-term fate of the universe: that there are galaxies we can see but can never reach, and that the universe will end in the slow burn-out of heat death rather than returning to the singularity.
Just imagine what we'll know after 5-10 years of JWST.
Heat death is one of the possibilities, but it's still very much an open question, as it depends on the curvature of the universe and the nature of dark energy.
If conformal cyclic cosmology is to be believed, heat death is indistinguishable from the singularity as entropy is infinite in both cases, so heat death "may as well be" the singularity.
So this is one of them "shower thoughts", you know, those stupid things you think about while working on something else. That is, purely science fiction.
Can an analog be drawn between galaxies and solar-system formation theories, with several orders of magnitude more time required for formation due to the difference in scale? That is, the galaxies we see with their billions of stars are actually young proto accretion disks, while mature galaxies will have most of their mass localized in a small (1-2) number of points, with a few (~10) point masses orbiting them: a sort of galaxy-sized solar system, if you will. However, due to the scale of things involved, all these late-stage point masses would be black holes.
It is worth considering that a black hole's hold on its galaxy is not as strong as a sun-like star's on its accretion disk. Our Sun is 99% of the mass of the solar system, our Sagittarius A* is ~4 million solar masses, while our Milky Way is 800 billion solar masses.
On top of that, black hole growth rate slows to nearly a stop at ~270 billion solar masses, so even if a black hole manages to meet the rare conditions to get that heavy, it would still be dwarfed by its galaxy.
But at the same time, I think it's been shown that the size of a galaxy almost always reflects the size of its central black hole... that is, the influence of the central black hole seems to be way, way bigger than what you would expect from the black hole's gravity alone.
That doesn't follow. A galaxy is not like a star system that orbits a single massive object in a single plane. Objects orbit the center of mass of the galaxy, which is made up of an enormous mass of stars and clouds of dust and gas.
For example, the central molecular zone (CMZ) is an asymmetrical roughly spherical region, about 1600-1900 light years in diameter, that contains about 60 million solar masses of gas and dust alone, not counting the stars in that region.
That's 15 times heavier than the central black hole, or put another way, the black hole is less than 6% of its mass. By contrast, the Sun is more than 99% of the mass in the solar system - the planets are rounding errors by comparison. With the central black hole, it's the other way around.
And that CMZ gas is only about 5% of the gas in the galaxy. There's over 26,000 light years between us and the black hole, and every star and gas cloud in that enormous volume exerts a gravitational influence on us.
Zooming out a bit further, the galactic bulge is on the order of 15 billion solar masses, with a roughly spherical radius of about 6500 light years. The central black hole is less than 0.03% of that mass. If you want to imagine us and other stars in the spiral arms as orbiting a single central object, that bulge would be a better choice. But as I said, there's still another 23,000 light years between us and that bulge, filled with stars whose gravitational influence we feel.
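The percentages quoted above are just division on the round figures cited in this thread; a quick sanity check in Python (all masses are the approximate values from the comment, not precise measurements):

```python
# Round figures from the thread, in solar masses.
SGR_A_STAR = 4e6    # central black hole (Sagittarius A*)
CMZ_GAS = 60e6      # gas and dust in the central molecular zone
BULGE = 15e9        # galactic bulge

# Sgr A* as a fraction of the CMZ gas alone: under 7%.
cmz_fraction = SGR_A_STAR / CMZ_GAS
# Sgr A* as a fraction of the bulge: under 0.03%.
bulge_fraction = SGR_A_STAR / BULGE

print(f"{cmz_fraction:.1%}, {bulge_fraction:.3%}")  # 6.7%, 0.027%
```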
If anything, causation is likely to work the other way around, in that larger galaxies have larger central black holes because they have denser central regions.
The questions that exist are more around how the central black hole affects galaxy formation, a sort of which came first question.
> If anything, causation is likely to work the other way around, in that larger galaxies have larger central black holes because they have denser central regions.
That contradicts the recent observations showing supermassive black holes were present as early as 700 million years after the Big Bang. That's just impossibly little time for supermassive black holes to have formed if they were merely the result of matter in the central part of a galaxy having "fallen" into a presumably smaller initial black hole, which I believe is the theory you're proposing as more likely. (Presumably smaller, because if you assume the black hole was already big to start with, then you must accept that the amount of matter in the center of the galaxy is a consequence of that black hole being there, not the other way around.)
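The timing argument can be made concrete with a standard back-of-the-envelope. A black hole accreting as fast as radiation pressure allows (the Eddington limit) grows exponentially, with an e-folding time of roughly 45 million years (the Salpeter time, assuming ~10% radiative efficiency). A sketch, where the seed and final masses are illustrative round numbers:

```python
import math

# e-folding (Salpeter) time for Eddington-limited accretion,
# assuming ~10% radiative efficiency -- a standard round number.
SALPETER_TIME_MYR = 45.0

def eddington_growth_time_myr(seed_mass, final_mass):
    """Myr needed to grow from seed to final mass at the Eddington limit."""
    return SALPETER_TIME_MYR * math.log(final_mass / seed_mass)

# A ~100 solar-mass stellar-collapse seed reaching a billion solar
# masses needs ~16 e-folds of uninterrupted maximum-rate accretion.
t = eddington_growth_time_myr(100, 1e9)
print(round(t))  # ~725 Myr
```

Since ~725 Myr exceeds the ~700 Myr available, a stellar-mass seed only makes it if it accretes at the maximum rate without pause from the start, which is why heavier "seed" black holes (or primordial ones) get invoked.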
An excellent reply, far better than my idle musings warranted; however, a point was missed. At some point in solar system formation, the mass of the not quite yet a sun was only 6% of the mass of the entire system. At some very early point, I assume the distribution of mass in the proto solar dust cloud would be the same as the distribution of mass in the galaxy.
I hesitate to reinforce my idea because, really, it is ill-informed musing on how solar system formation is coalescing gas and dust clouds, and on whether galaxy-sized objects would eventually coalesce into the same proportion of objects. Ill-informed because, it turns out, we can look into the past and see what newer galaxies look like.
> the mass of the not quite yet a sun was only 6% of the mass of the entire system.
That seems like an arbitrary division of the mass of the system. More than 99% of the mass of the system was going to become a star, so on what basis are you identifying a small subset of that as being significant? Gravity involves the net effect of the forces between every particle in a system.
> At some very early point I assume the distribution of mass in the proto solar dust cloud would be the same as the distribution of mass in the galaxy. [...] would galaxy sized objects eventually coalesce into the same proportion of objects.
No and no, because of the inverse square law. Galaxies don't look like large solar systems because they're much bigger. The Sun has a big effect on all the planets in the solar system, but it doesn't have much of an effect on Alpha or Proxima Centauri, even though that's our nearest neighbor.
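That inverse-square point is easy to quantify. A small sketch comparing the Sun's gravitational acceleration at Earth's orbit with its acceleration at Proxima Centauri's distance (constants and distances are standard round values):

```python
G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30         # kg
AU = 1.496e11            # metres (Earth-Sun distance)
LIGHT_YEAR = 9.461e15    # metres

def sun_acceleration(distance_m):
    """Gravitational acceleration toward the Sun at a given distance."""
    return G * M_SUN / distance_m ** 2

at_earth = sun_acceleration(AU)                    # ~5.9e-3 m/s^2
at_proxima = sun_acceleration(4.25 * LIGHT_YEAR)   # ~8e-14 m/s^2

# The Sun's pull on Proxima is about ten orders of magnitude weaker
# than its pull on Earth, courtesy of the inverse-square law.
print(f"{at_earth / at_proxima:.0e}")  # ~7e+10
```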
It isn't. Hawking radiation slows as the black hole's size increases. I believe it's more just that it's very unlikely for too much mass to pass by a black hole.
Wow, I'd assumed that as a black hole grows, there's more surface for Hawking radiation to escape from, so it would lose more mass... I guess physics and intuition don't go well together in some cases :)
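For reference, the scaling both replies are gesturing at is the standard semiclassical result: Hawking temperature goes as 1/M, and radiated power as 1/M², so bigger black holes are colder and evaporate more slowly. A sketch using the usual Schwarzschild formula (constants are standard round values):

```python
import math

HBAR = 1.0546e-34    # reduced Planck constant, J*s
C = 2.998e8          # speed of light, m/s
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
K_B = 1.3807e-23     # Boltzmann constant, J/K
M_SUN = 1.989e30     # kg

def hawking_temperature(mass_kg):
    """Semiclassical Hawking temperature of a Schwarzschild black hole."""
    return HBAR * C ** 3 / (8 * math.pi * G * mass_kg * K_B)

# Temperature scales as 1/M: a black hole ten times heavier is ten
# times colder, and its radiated power (T^4 times horizon area,
# i.e. proportional to 1/M^2) is a hundred times smaller.
t_solar = hawking_temperature(M_SUN)
assert abs(hawking_temperature(10 * M_SUN) * 10 - t_solar) < 1e-12
print(f"{t_solar:.1e}")  # ~6.2e-08 K: far colder than the CMB
```

A solar-mass black hole is colder than the 2.7 K microwave background, so today it absorbs more energy than it radiates and grows rather than evaporates.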
Not an astronomer, but there's at least one important thing to consider:
Stars form in denser regions of gas. However once they grow large enough to start fusion, the solar wind will blow away the gas, halting growth of that star system. So stars have a kind of limit when it comes to size.
That's what makes these early-universe large black holes problematic (read: exciting new science): we know black holes with the mass of stars can form via a supernova. However, the only way we know these supermassive black holes can form is by smaller black holes merging. There's simply not enough time for black holes as large as we think JWST has seen to have formed in the age of the universe we think we're looking at. Something in the physics has to give.
... the only way we know that these supermassive black holes can form is by smaller black holes merging.
The main way supermassive black holes in the centers of galaxies grow is by accreting gas, not by mergers with other black holes. (But you can still have potential problems with not being able to accrete enough gas in a short enough time to explain early SMBHs, unless the initial "seed" BHs are larger than those formed by conventional supernovae.)
The vast majority of stars will end up as white dwarfs. Stars need to be about 10 times the mass of the sun to undergo core collapse, and about 25 solar masses (or more, depending on composition) to form black holes.
> So it stands to reason so would galaxies.
Galaxies are loosely bound clouds of dust with a few stars here and there, relatively speaking; stars are dense roiling balls of plasma. Their dynamics are extremely different.
> But then why would they be all over the early universe?
Stars can form quite quickly (relative to the age of the universe), possibly as early as 150 million years after the Big Bang. And the earliest massive stars would have had very short lifespans.
Can someone who really knows this stuff clarify my understanding?
I've been given to understand that black holes remain a physics theory that has not been directly observed or proven.
I thought that we had a fair amount of observational evidence that suggests their existence, but no "smoking gun" proof of it, and that these types of "observed a black hole" articles are a little misleading: that in reality, what was observed was largely indirect, interpreted data that is consistent with black hole theory but could also be something else.
The theoretical prediction of black holes dates back to 1916, when Karl Schwarzschild found a solution to the field equations of Einstein's general theory of relativity. General relativity is arguably the most robust theory in physics; that being said, it introduces some tension with the other most robust theory in physics, namely quantum mechanics and the Standard Model. These two theories are largely accepted to be accurate and correct, and since they predict different phenomena at different length scales, most scientists have no trouble subscribing to both.
With respect to "direct observation" of a black hole, astronomers at the Event Horizon Telescope successfully reconstructed images of a black hole's shadow (from millimeter-wavelength radio observations, not visible light). But to be fair, this comes after decades of overwhelming experimental evidence for black holes + GR. The EHT result probably received the media attention it did because it is easily interpretable as a nice lil JPEG, rather than a 12-page paper with dense plots and LaTeX.
I’ve read something like this: there are a bunch of observations of extremely dense masses in various surroundings. Talented people can find explanations other than black holes in each individual case.
But that feels ad hoc, and a black hole fits quite nicely for the whole class of observations. The current consensus is that black holes exist.
My read from looking at the literature is that the feeling is more like "black holes are a mathematical theory that have not been falsified by observational evidence, and for which we have some compelling, but inconclusive data, reconstructed from intensely noisy and poor resolution sources. Seems likely, but more data needed."
If you're sufficiently cynical, you can describe almost all astronomy results that way. Lots of alternate explanations for the observations have been proposed, and fail to match the full evidence.
Just how much evidence are you going to demand of phenomena hundreds of lightyears away before you say, "yeah, that's probably what it is"? Anyway, do be sure to read the whole list of observations linked above by csours, as there are quite a few.
See, this is my beef with the reporting on this stuff and the source of my question.
The "photo" you're referring to was constructed by a supercomputer massaging enormous amounts of data to fit a model, then constructing an image from weighted averages of thousands of data images across the spectrum, from an enormously complex array of detectors in an unfathomably noisy environment [1].
It was presented by the media as a "photo", but it seems to me like it's more similar to the "image" we get from reconstructing wifi signals in a room to determine interior objects' shapes.
Or maybe a better example would be the "photos" of atoms, which are not photos at all but a reconstruction of minuscule electromagnetic resistance readings from an STM whisker driven by a piezoelectric crystal.
I think we're supposed to get the ability to have better "photos" with the activation of LISA in 2037 [2] but until then, claiming that we have photos of black holes seems a bit misleading (not by you, I mean the science reporting).
LISA is a scaled-up version of LIGO, and won't produce "photos" any more than LIGO does. Of course, LIGO has provided extremely good evidence that black holes exist (most of what it detects are the gravitational waves from two black holes merging), but if you don't buy that, then you won't buy LISA's observations, either.
> It was presented by the media as a "photo", but it seems to me like it's more similar to the "image" we get from reconstruction of wifi signals
Are you suggesting that emitting or reflecting visible radiation is required before we can prove something exists in objective reality? Your own point about Wifi signals proves you wrong - we can detect wifi in other wavelengths and thus learn a great deal about it. Just like we can with black holes.
There is no distinction between "images" and "photos" - they're both electromagnetic radiation converted to a visual medium. There's nothing unique about radiation in the 380 to 700 nanometers wavelength range that makes it somehow more scientifically valid for remote sensing.
Any digital or analog photo is a far cry from our biological signal processing.
Yes, their methodology used lots of signal manipulation to produce the photo. Just like an RGB digital camera does. Just like analog film. Just like our rods, cones, and visual cortex. Which one is the "correct" way to process the signal?
What you're claiming is that "photos" have privileged scientific status over other types of sensors/processing hardware. Someone else might claim that only biological sensors have privileged status. But neither claim is scientific since it cannot be verified.
In lieu of a testable hypothesis, we have to conclude that all frequencies of electromagnetic radiation and all mathematically-valid signal processing techniques are fair game for empirical observation.
Einstein himself didn't think black holes physically existed (1916), and it took until 1971, with Cygnus X-1, for serious consideration; that was the smoking gun. Fifty-five years of being considered probably a mathematical fluke, and then observational evidence.
A part of CCC[1] is giant black holes outlasting the end of the universe to be around for the beginning of the next one. Is this a potential explanation?
Do they? One of the papers in support of CCC found patterns in the CMB that might have been black holes remaining from the last cycle. I wonder if they'd start eating the new matter created by the big bang and survive.
I wish stories like this were a more straightforward recounting of the interesting facts and relevant input from experts. I don't really need the origin story of the conference this finding was discussed at.
I agree. I feel like they are taking "story" too literally now. Science news is soon approaching the level of superfluous meandering of online recipes. I'm not saying I don't care about the context, but a news article should be different to a short story. And, I still think there is a place for these things, but not for all articles, only some.
The stuff you're complaining about seems limited to a little bit of POV text in the first seven (!) sentences of a rather long article. Does it really bug you so much that they gave you just a tiny taste of the perspective of a woman who works on this stuff? That seems a bit much. This kind of text is everywhere in science journalism.
I likewise do not like the front loaded flavor text.
A good news article should be written in order of priority. Each word, each sentence should be written in order of decreasing priority starting with the headline. "JWST spots giant black holes all over the early universe" is a great headline, it encompasses the entire story. Especially at the beginning of the article, the purpose should be expand and clarify the most important pieces.
I really dislike the kind of journalism that goes into biography mode and starts off with an anecdote about Jane Einstein's childhood dog leaving the most relevant bits 2/3 the way through.
I'd like to be able to choose the level of detail I get from an article by being able to stop when I'm done.
I don't like having to dedicate several minutes to finding a piece of information that took several seconds to deliver. Much like the 10 minute youtube howto video hiding the five seconds of information I needed.
I utterly despise the current “long form” style where all long form stories start by talking about something old and unimportant.
Sure, it’s a nice appetizer to whet the palate. But only if I already know I’m gonna read the whole thing! There’s too much long form for me to read. I want a tastier first bite to see if it’s worth my time.
The first spate of articles from this publication were like this, which really made it stand apart from the others. Somehow a complete inversion has taken place since then.
English text transformation is a task it’s exceptionally good at. You just need to prompt it to only use the source material you give it for summarization.
Guilty as charged! But to be fair, this applies to human writing too:
“Briefly stated, the Gell-Mann Amnesia effect is as follows. You open the newspaper to an article on some subject you know well. In Murray’s case, physics. In mine, show business. You read the article and see the journalist has absolutely no understanding of either the facts or the issues. Often, the article is so wrong it actually presents the story backward—reversing cause and effect. I call these the “wet streets cause rain” stories. Paper’s full of them.
In any case, you read with exasperation or amusement the multiple errors in a story, and then turn the page to national or international affairs, and read as if the rest of the newspaper was somehow more accurate about Palestine than the baloney you just read. You turn the page, and forget what you know.”
> Over the course of a year, Chandra stared at the cosmic lens for two weeks — one of its longest observation campaigns yet — and collected 19 X-ray photons coming from a galaxy called UHZ1, at a redshift of 10.1. Those 19 high-octane photons most likely came from a growing black hole that existed fewer than half a billion years after the Big Bang, making it by far the most distant X-ray source ever detected.
This blew my tiny mind.
19 photons over 2 weeks. That's such a tiny amount of light. How are we able to conclude anything from such tiny amounts of light? (I'm not disputing that we can, it just blows my mind that we can).
And those photons - 13+ billion years speeding through space, all that vast empty time and distance, to end up hitting a detector during those two weeks when doing so will change the way we think about the universe. And of course the photon itself hasn't experienced any of that time. For the photon, it was all the same moment.
Also blows my mind that any time I'm standing outside I'm being hit by photons that have been traveling for billions of years to reach that point. Of course they're overwhelmed by photons from the Sun, but still. Some photon left a sun billions of years ago, sped through the universe for all that time, and ended up hitting my retina. And I don't even realise when this happens.
I understand why theists don't like science. How can your puny god compare to shit like this?
You used to be able to tune a TV between channels and listen to the hiss of the cosmic microwave background. That’s light from over a hundred million years before the first stars.
Digital TVs just don’t have that 2.73 degrees above absolute zero sound to them.
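As a quick check on why 2.7 K radiation reads as "microwave": Wien's displacement law puts the blackbody peak near a millimetre. (The constant below is the standard round value; note that an analog TV antenna actually samples the much longer-wavelength tail of that spectrum, where the CMB is only a small fraction of the between-channels static.)

```python
WIEN_B = 2.898e-3   # Wien's displacement constant, metre-kelvins
T_CMB = 2.725       # CMB temperature, K

# A ~2.7 K blackbody peaks at about a millimetre: squarely in the
# microwave band, hence "cosmic microwave background".
peak_wavelength_mm = WIEN_B / T_CMB * 1000
print(f"{peak_wavelength_mm:.2f} mm")  # 1.06 mm
```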
Now imagine that the Chandra telescope has a 1.2 meter diameter aperture that those 19 photons passed through.
That represents 1.13 square meters of light collecting area, peering into that expanding spherical wavefront at the point where it is 26+ billion light years in diameter.
That would be a sphere with a surface area of about 2 × 10²¹ square light years. I wonder what it would add up to if you had the 19 photons from every one of those other 1.13-square-meter chunks of area in one place.
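That wondering can be answered to order of magnitude: if the source radiated isotropically, scale the 19 photons by the ratio of the wavefront's area to Chandra's aperture. A sketch using the thread's round figures (the 13.2-billion-light-year light-travel distance is an assumed round value for a redshift-10 source):

```python
import math

LIGHT_YEAR_M = 9.461e15       # metres per light year
TRAVEL_DISTANCE_LY = 13.2e9   # assumed light-travel distance at z ~ 10
APERTURE_AREA_M2 = 1.13       # Chandra's collecting area, per the comment
PHOTONS_CAUGHT = 19

radius_m = TRAVEL_DISTANCE_LY * LIGHT_YEAR_M
sphere_area_m2 = 4 * math.pi * radius_m ** 2   # ~2e53 m^2

# If the source radiated isotropically, every 1.13 m^2 patch of the
# expanding wavefront received roughly the same 19 photons, so the
# shell passing Earth during those two weeks carried on the order of:
total_photons = sphere_area_m2 / APERTURE_AREA_M2 * PHOTONS_CAUGHT
print(f"{total_photons:.0e}")  # ~3e+54 X-ray photons
```

On the order of 10⁵⁴ X-ray photons crossed that shell during the same two weeks; Chandra caught 19 of them.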
I've wondered why black holes have a similar 2D shape to hurricanes and whirlpools, where the center is empty. What if the inside of a black hole is devoid of all things, including spacetime? Although that doesn't make for good sci-fi.
I believe this actually is one of the hypotheses about black hole interiors, though I forget the name. Nothing to do with terrestrial storms or vortices, though; a black hole doesn't have to have any rotation at all.
Infalling matter usually is "whirlpooling", see "accretion disk". That's a separate question from the structure of the black hole itself and whether there's anything inside the event horizon.