It's very useful for B2B sellers to know who is looking at their website and who might be good targets to reach out to. Knowing that Steve, the SVP at Databricks, is looking at your website vs. cookie 1234 is a pretty key differentiator.
From first-hand experience, B2B sellers do not get this information today. They typically get "signals" like "Company X is searching for product category Y" or "Company X is visiting your website". And "Company X's CIO is John Smith - here's his phone number". But nobody I've seen claims to offer individual names of website visitors.
Services like retention.com provide email addresses for otherwise anonymous site visitors based on information provided to them by their "publisher network".
There are a lot of companies selling exactly this service. In some cases, they're able to do it pretty easily, because most people don't actually make any effort whatsoever to remain anonymous. And in a B2B setting, there's a good chance your own company is feeding their data into the same pool.
Some companies are trying to sell you on "AI based" solutions that take deanonymization a step further, but as far as I can tell, it's mostly horseshit and wishful thinking.
What I mean is that they match "anonymous" web traffic to the account level with enough granularity that you can line it up with your own first-party data.
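To make that concrete, here's a toy sketch of the pairing. Everything in it (field names, feed shape, the data itself) is hypothetical; real vendor feeds and CRMs differ.

```python
# Hypothetical sketch of account-level matching. All fields and data are
# invented for illustration; real vendor feeds and CRM schemas differ.

vendor_visits = [  # "anonymous" traffic, already resolved to a company
    {"company_domain": "databricks.com", "page": "/pricing", "visits": 7},
    {"company_domain": "example.com", "page": "/blog", "visits": 1},
]

crm_contacts = [  # your own first-party data
    {"name": "Steve", "title": "SVP", "email": "steve@databricks.com"},
    {"name": "Dana", "title": "Engineer", "email": "dana@example.com"},
]

def domain(email):
    """Company domain from an email address."""
    return email.split("@", 1)[1]

# Join visit signals to known contacts by company domain: from
# "cookie 1234 visited /pricing" to "someone at Databricks, maybe Steve".
matches = [
    (visit["page"], contact["name"], contact["title"])
    for visit in vendor_visits
    for contact in crm_contacts
    if domain(contact["email"]) == visit["company_domain"]
]
print(matches)
```

The point is that once traffic is resolved to a company, your own contact data often narrows "someone at Databricks" down to a short list of likely names.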
That said, most of the companies in this space massively overpromise and underdeliver. The one thing they all have in common is that they're good at ABM, so they can sell to your CMO/CRO faster than you can call them out on their bullshit.
The question is "how long has it been since the big bang?" It's an important and relevant question for cosmology and physics. It isn't really a stance on the "beginning of time," which may have started long before that, but it is the start of the universe as far as physics is concerned.
How I work it out in my head that time effectively did not exist before the big bang is this: if everyone agrees that time slows as gravity increases, and we assume that at the time of the big bang all the mass of the universe was in an infinitely small space, the conclusion is that gravity was infinitely large and time would effectively be stopped. Take it with a grain of salt.
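As an aside, the "time slows as gravity increases" part is quantitative in GR: for a static observer outside a spherical mass, the Schwarzschild factor sqrt(1 - 2GM/(r c^2)) gives proper time per coordinate time, and it goes to zero as you approach the horizon. A small sketch, as an illustration only (the early universe is not a Schwarzschild geometry):

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8    # speed of light, m/s

def dilation_factor(M, r):
    """Schwarzschild time-dilation factor sqrt(1 - 2GM/(r c^2)):
    proper time per coordinate time for a static observer at radius r.
    Tends to 0 as r approaches the Schwarzschild radius 2GM/c^2."""
    rs = 2 * G * M / c**2
    if r <= rs:
        raise ValueError("inside the horizon; no static observers there")
    return math.sqrt(1 - rs / r)

M_sun = 1.989e30                         # kg
print(dilation_factor(M_sun, 6.957e8))   # Sun's surface: ~0.999998
print(dilation_factor(M_sun, 3000))      # just above a solar-mass horizon
```

The grain-of-salt caveat stands: extrapolating this to "all the mass in an infinitely small space" is exactly the regime where GR stops being trustworthy.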
Exactly. It could even be turtles all the way down, with new building blocks of physics becoming relevant as we go smaller and smaller (and back in time).
I like this. But you still have a Prime Mover problem. If time effectively halts at T0, then what possible event could occur (outside of time?) to nudge that infinitely large glob of mass into the motion we observe today?
T0 includes the prime movement in its definition. If the universe never emerges, change never exists, so it’s not anything (including not T0). Conversely if we know it was T0, it’s because we know things started happening after that.
The universe advanced from T0 because it had to, by definition.
If this seems like a weird cop-out, well, that’s a singularity for you.
The glob of mass starts off in motion at T0; there is no time prior to T0 in which a glob of mass exists waiting to be nudged.
Think of it like a particle decaying: at some point it just happens, with no trigger or internal mechanism, and now you suddenly have the decay products whizzing about with some energy and momentum that they just start with.
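That "no trigger" character of decay is what an exponential (memoryless) distribution captures: the chance of decaying in the next instant doesn't depend on how long the particle has already survived. A quick simulation sketch:

```python
import random

# Decay times of a particle with half-life 1.0, sampled from an
# exponential distribution with rate ln(2) / half_life.
half_life = 1.0
rate = 0.693147 / half_life

random.seed(0)
times = [random.expovariate(rate) for _ in range(100_000)]

# Fraction surviving past one half-life: should be ~0.5.
survived = sum(t > half_life for t in times) / len(times)
print(round(survived, 2))

# Memorylessness: among particles that already survived one half-life,
# the fraction surviving one MORE half-life is again ~0.5 - no internal
# clock, no trigger.
old = [t for t in times if t > half_life]
survived_again = sum(t > 2 * half_life for t in old) / len(old)
print(round(survived_again, 2))
```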
This is where we see the limits of science. It's a great tool, and a lot of benefit came from it. But at some point we encounter things beyond its domain. We people of faith know that God is the cause of existence. In Islamic Kalam we have a phrase (Wajeb Al-Wujood), meaning The One whose existence is obligatory/fundamental, i.e. God.
> If time effectively halts at T0, then what possible event could occur (outside of time?) to nudge that infinitely large glob of mass into the motion we observe today?
The anthropic principle explains it. If you need a Prime Mover, you can replace it with the anthropic principle and random chance.
For example, if time halts at T0, physics breaks. Taking the limit as t -> 0, you get that v -> infinity (or c). The slower time moves for you, the faster you move through space. At that point, if nothing is there to detect it, every massless particle is effectively everywhere, all at once. Essentially photons can have infinite energy and just cause a random Big Bang. This is a CCC-like explanation.
Another answer is Hawking's north-of-the-North-Pole time problem (from The Universe in a Nutshell). If time halts at T0, you can't ask what's before it; it's a logical impossibility, like asking what is north of the North Pole. The answer is that the question is invalid. The anthropic principle implies there are however many such configurations, so you get a parallel or sequential universe solution.
We arise and wonder how exactly a universe exists that supports life, even though we can clearly see it's only supporting life at this moment. Our existence is a grain of sand in a universe-sized hourglass that will end in the Black Hole epoch.
No, I am not. I've demonstrated my work; you can prove me wrong using an argument rather than "you're wrong". The universe doesn't have to have a why, and science isn't dealing with such metaphysical questions. That's the realm of philosophy.
The principle of sufficient reason states that everything must have a reason or a cause. This is not a principle to lazily toss aside to avoid some more difficult question, it is a foundational idea.
Our understanding of the world (aka science) is largely based on causality, and if things existed without any cause or reason, our understanding of "science" would probably be very different.
(I'm not a physicist, but this is how I understand things.)
I mean... that's a pretty superficial reading of the situation. Where do you stop? If you assume Newtonian time, which extends back to minus infinity... well, what caused time to exist at all?
Also, no one is "lazily tossing it aside". It comes out of the mathematics that describe the observations:
Say you find that all observations you make are perfectly described by dx/dt = 1/x at current time t1. If you follow the trajectories backwards, you find that the trajectory cannot be extended back past some initial time t0 at which x(t0) = 0, as the equation becomes singular there. You are now at time t1 - t0 from the initial singularity. That is the age of the universe since the big bang.

Now the trajectory as it approaches t0 has some unusual properties: it moves infinitely fast, etc. These might lead you to postulate that unknown physics will actually invalidate your law as you approach t0. But there is nothing logically or epistemically _wrong_ with the law you have. The finiteness of time flows out of a causal empirical induction argument. It is not introduced ad hoc to avoid some difficulty; it just is how we find nature to be at the most conservative interpretation of the evidence.
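The toy law dx/dt = 1/x can be checked directly: the closed-form solution is x(t) = sqrt(2(t - t0)), so any state (t1, x1) implies a singularity at t0 = t1 - x1^2/2, and a backwards numerical integration runs into it. A small sketch (toy model only, not real cosmology):

```python
def t0_from(t1, x1):
    """Initial singularity implied by dx/dt = 1/x and the state (t1, x1):
    from x(t) = sqrt(2*(t - t0)) we get t0 = t1 - x1**2 / 2."""
    return t1 - x1**2 / 2

def integrate_back(t1, x1, dt=1e-4):
    """Explicit Euler steps taken backwards in time until x hits ~0.
    dx/dt blows up as x -> 0, so the trajectory cannot be extended
    past the singular time it finds."""
    t, x = t1, x1
    while x > dt:       # stop just before the step becomes singular
        x -= dt / x     # going back in time, x shrinks
        t -= dt
    return t

t1, x1 = 1.0, 1.0
print(t0_from(t1, x1))         # 0.5 from the closed form
print(integrate_back(t1, x1))  # ~0.5 numerically
```

The "age of the universe" in the analogy is t1 - t0. That the numerics agree with the closed form is the point: the finite age is forced by the law itself, not assumed.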
In that case, we are not disposing of any principles (as you originally claimed) when we have an uncaused first cause at t = -13.8 billion years instead of at t = -infinity.
I'm not sure "accepted" is the correct word. It is a feature of our current theory, but we know that theory is incomplete. In particular, high-density regions such as the very early Universe are where we know our theories start to break down, as the conflicts between quantum physics and gravity become relevant.
There are proposals for time stretching infinitely back, but we have almost no way of testing them.
If time is stopped, then why did it bang? The fundamental character of time is the ability to change. If time is stopped, then there should be no change. Otherwise time wasn't stopped.
Maybe I guess. Many models stipulate that time began with the Big Bang. Many propose the Big Bang was a local event that obliterated our ability to observe time before it. We have models where the universe rips apart, collapses, or just evolves forever and always has and always will. I think what’s crucial to understand is we have a lot of different possible explanations for what we see, some of them discuss beginnings and ends, some do not. Perhaps as a mathematician with a relatively closed set of possibilities for explanations that’s unsettling. But, I’ve always found the various paradoxes in math to illustrate similar problems in formulating a closed and coherent anything, including the universe.
One thing I have always wondered, since gravity is proportional to the mass of the two objects and inversely proportional to the square of the distance between them, if the universe was smaller with the same mass, wouldn’t gravity have been more “dense” in an earlier universe?
And since we know that gravity affects the rate of flow of time, wouldn't the rate of time be enormously distorted in an earlier universe?
I’m not trained in any of this, so hopefully there are greater minds here who can help me understand
> inversely proportional to the square of the distance between them
As I understand it, that’s an approximation for Euclidean space because the area of a sphere is also proportional to the square of the radius in such a space, but it’s not true of non-Euclidean spaces like in GR because the area-radius relation is different.
IIRC, the cosmic microwave background has a redshift of about 1100, so the area of that shell is the same as one 1100 times closer, or 1/1100^2 times the area of a Euclidean sphere with that radius.
> And since we know that gravity affects the rate of flow of time, wouldn't the rate of time be enormously distorted in an earlier universe?
Time did indeed slow down then compared to now, although it’s not entirely obvious to me that this has any physical interpretation when it happens “everywhere”: https://youtu.be/66V4RSmDqYM
My understanding is that time was not slower (whatever that would mean), only that the expansion of the universe means that light is stretched. So events in the early universe appear to happen 5 times slower.
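The stretch factor is (1 + z): an event at redshift z appears to run (1 + z) times slower, so the "5 times slower" figure corresponds to z ≈ 4 (my reading; the comment doesn't name a redshift). A trivial sketch:

```python
def observed_slowdown(z):
    """Cosmological time dilation: an event at redshift z appears
    stretched by the same factor (1 + z) that stretches its light."""
    return 1 + z

print(observed_slowdown(4))     # 5 - e.g. "clocks" at redshift z = 4
print(observed_slowdown(1100))  # 1101 - the CMB-era stretch factor
```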
We know that the force-carrying particles of all forces have a frequency, just like any other particle. That means that if particles on average move faster than, say, double that frequency, they can't exist.
So there must have been a time when electromagnetism, the weak and even the strong force just didn't exist. They couldn't. So particles would just have totally ignored those forces.
We don't know if gravity is the same, but ... why wouldn't it. Though of course according to relativity gravity just wouldn't care, but that just raises a lot more questions than it answers.
I don't think there's a good answer to this question, at least not when it comes to the nature of particles and light, because we don't have a good answer to what movement and time are. We already know both light and particles are, in a sense, you moving relative to fields. Light is you moving relative to an electromagnetic field. If you move towards a magnetic field, you would find it starts "glowing"; in fact, that is what light is. But it's not as if the magnetic field reacts to you because you start moving.
The issue is that movement and time are fundamental to the universe, yes, in the way relativity describes, but also in a totally different, unknown way. In some ways particles are "just" things moving relative to one another. Which, for one thing, raises the perspective question: if you accelerated to a "real" speed, would you see a different universe? You would disagree with us slowpokes here on Earth about which particles exist, at least in some cases. But would you see an entirely different universe?
> It isn't really a stance on the "beginning of time," which may have started long before
Well... yes it is, in the rigorous sense of "time" defined by general relativity. There's no "before" for a singularity. It may not be the whole story, but whatever metaphysical notion defines the "before/beyond/outside/why" that drives the big bang, it's not a place on the "time" axis of spacetime.
Specifically, GR is a model that breaks down at singularities. That time "begins" at the Big Bang is a prediction of GR, but until we have a model of quantum gravity there's no telling whether that's actually true or whether the conditions at the big bang are something GR can't fully describe.
Similar to the singularities in black holes - everything up to a stone's throw of the event horizon is pretty well explained by GR, but as far as the horizon itself or the region beyond are concerned, there might be dragons as far as we know.
> That time "begins" at the Big Bang is a prediction of GR
I don’t think that’s right. If we interpret Big Bang theory as claiming that there is a singularity at a finite distance into the past history of every present event, then GR can’t predict what happened at or prior to that singularity. Whether time “began” then or whether there was “more time on the other side” is a question GR alone cannot answer, not a prediction of GR
Black hole singularities do not start right after their event horizon. The event horizon only demarcates where the black hole singularity becomes an inevitable (inescapable) point in all possible futures.
That's not what I was trying to imply, sorry. It's the singularity at the center where GR entirely breaks down, but there's also weird stuff going on below the event horizon (space becoming time-like and vice versa), that aren't present in, e.g. String Theory's Fuzzballs [1] (which, of course, bring their own set of thorough weirdness). So what I was trying to say was that while GR predicts some behavior below the event horizon, a full model of quantum gravity could predict something entirely different, and not only for the area just around the singularity itself but (maybe) up to the event horizon.
How does that work for black holes? It seems like there would be a 'before' they formed in the time dimension of our universe, if not within the singularity itself.
For black holes it's the reverse: all paths lead to the singularity, and there is no 'after', as opposed to the big bang, where all paths lead away from the singularity and there is no 'before.' If you hit rewind on a video of matter falling into a black hole's singularity, it would look like a big bang where everything is created from nothing at an infinitely dense point and starts flying outwards.
Think of singularities as unidirectional. We don't understand what if anything was before the big bang, there is no return from inside a black hole event horizon, we don't understand what would follow after an AI singularity. That doesn't mean that they don't have a threshold in time/space/spacetime, just that crossing that threshold breaks the rules we know.
Singularities are mathematical constructs and are used to model different kinds of phenomena. Black holes and the big bang are only roughly comparable (but by no means similar) if you are considering a black hole from "inside" of one.
There might literally not be enough time to expand beyond that, given how cosmological horizons work. Being part of the system we're trying to observe puts some nasty limits on what we can know, even in principle.
I agree we may practically never know, and fundamentally the rules of the universe might end up making it impossible to know some things, but as far as I know the ultimate limits are currently unknown, so 'We can hope ...'.
There’s an alternative take that says that, as the universe contracts, it eventually hits a point where other effects (quantum, but also possibly unknown effects) predominate and that this kicks off another stage of expansion.
Which is to say that there exist respected papers that outline this scenario in great detail, but there’s precious little observable evidence of a previous universal cycle.
Every solution in the space is bad, and it's a wonder that any Python software is maintainable. I think the only real answer is that everyone gets fucked all the time by these bugs and rough edges.
I say this as someone who has done an unfortunately large amount of work in the Python packaging and package distribution space, and I still get bitten every few months.
My personal experience is that, the bulk of the time, these rewrites don't end up delivering the performance increases people expect, because they don't really understand what's making the prior system slow and don't develop good enough requirements.
This is not to say you shouldn't; it doesn't really matter to me. But I think a lot of the time this is driven by engineers who feel like they need to do it, rather than by a financial decision to save money.
My experience is along the following lines (same for Django as well actually):
1. Company started as Rails monolith
2. Some components of the Rails monolith stretch Ruby capabilities beyond where "throw more servers at it" is still reasonable or effective
3. Factor out some backend functionality into dedicated services which can be scaled separately, Rails still serves as the API gateway calling these services. Anything without scaling issues stays in the monolith for now.
4. Eventually Rails is just an API gateway, no one in the org knows Ruby/Rails and its dependency management madness any more, and it gets replaced with a more-performant, purpose-built API gateway, usually something off-the-shelf.
You don't wanna get hacked but basically everyone gets hacked, so it's more of a question of "how well does your security and monitoring stand up to hacking?"
The big red flag here is that they didn't catch it for so long! How did they not notice?
Meta recently announced a $100 price increase. So they are able to increase prices without supply dropping. It's not entirely clear why, but it fits the pattern of popular -> price goes up.