
I didn't realize how integral Monte Carlo sims were to our early advances in nuclear technology. It makes sense to me, though: it seems like the Monte Carlo method lets you punch above your weight class in terms of measuring and predicting phenomena that are too complex, or too expensive to deterministically model.

My intuition tells me that its effectiveness would fall off as the complexity of the input/output relationship scales. Is this true? Or can sufficient sample density overcome arbitrary levels of that type of complexity?
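
For concreteness, here's a minimal sketch of the kind of thing I mean (estimating pi, as a stand-in for anything too hairy to compute deterministically). The textbook fact relevant to my question is that the standard error shrinks like 1/sqrt(N) regardless of how complicated the underlying relationship is:

    import random

    def estimate_pi(n_samples: int) -> float:
        # Count points in the unit square that land inside the quarter circle.
        inside = sum(
            1 for _ in range(n_samples)
            if random.random() ** 2 + random.random() ** 2 <= 1.0
        )
        return 4.0 * inside / n_samples

    # Error shrinks roughly like 1/sqrt(N), independent of dimension.
    for n in (1_000, 100_000, 1_000_000):
        est = estimate_pi(n)
        print(f"N={n:>9,}  estimate={est:.5f}  error={abs(est - 3.14159265):.5f}")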




> the Monte Carlo method lets you punch above your weight class in terms of measuring and predicting phenomena that are too complex, or too expensive to deterministically model.

Yes, this is exactly why I like it. At AWS, we've used Monte Carlo simulations quite extensively to model the behavior of complex distributed systems and distributed databases. These are typically systems with complex interactions between many components, each linked by a network with complex behavior of its own. Latency and response time distributions are typically multi-modal, and hard to deal with analytically.
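
To give a flavour (a toy sketch, not our actual tooling: hypothetical replica latencies drawn from a bimodal mixture, a quorum-of-2 fan-out, and tail latencies estimated from the samples):

    import random

    # Each replica's latency is bimodal: usually fast, occasionally slow
    # (e.g. a GC pause or retransmit) -- the multi-modality is what makes
    # this hard to treat analytically.
    def replica_latency_ms() -> float:
        if random.random() < 0.95:
            return random.gauss(5.0, 1.0)    # fast mode
        return random.gauss(50.0, 10.0)      # slow mode

    # One request fans out to three replicas and waits for the fastest two.
    def request_latency_ms() -> float:
        samples = sorted(replica_latency_ms() for _ in range(3))
        return samples[1]  # second-fastest replica completes the quorum

    latencies = sorted(request_latency_ms() for _ in range(100_000))
    for q in (0.50, 0.99, 0.999):
        print(f"p{q * 100:g}: {latencies[int(q * len(latencies))]:.1f} ms")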

One direction I'm particularly excited by in this niche is converging simulation tools and model checking tools. For example, we could have a tool like P use the same specification for exhaustive model checking, fuzzing invariants, and doing MC (and MCMC) to produce statistical models of things like latency and availability.


A while ago I wrote this as a simple introduction to applying MC methods in distributed systems: https://brooker.co.za/blog/2022/04/11/simulation.html


This sounds really interesting - do you have any recommendations for further reading?


It is hard to generalise about the suitability of Monte Carlo methods. In practical applications they are almost always used in hybrid systems, combined with analytical methods and problem-specific shortcuts. How to apply Monte Carlo methods to a given problem tends to be an open-ended question.
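
One common hybrid, for example, is control variates: pair the quantity you're simulating with a correlated quantity whose expectation you can compute analytically, and use the known part to cancel sampling noise. A minimal sketch (the integrand e^U is just a placeholder):

    import math
    import random
    import statistics

    # Estimate E[e^U] for U ~ Uniform(0, 1). We know E[U] = 0.5 analytically,
    # and U is strongly correlated with e^U, so subtracting the known part
    # shrinks the estimator's variance.
    N = 100_000
    us = [random.random() for _ in range(N)]
    plain = [math.exp(u) for u in us]
    c = 1.69  # near-optimal coefficient: Cov(e^U, U) / Var(U)
    controlled = [p - c * (u - 0.5) for p, u in zip(plain, us)]

    print("plain     :", statistics.mean(plain), "+/-", statistics.stdev(plain) / math.sqrt(N))
    print("controlled:", statistics.mean(controlled), "+/-", statistics.stdev(controlled) / math.sqrt(N))
    print("exact     :", math.e - 1)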


You should be able to model "infinite" complexity; the way people design analog circuits is basically this.


Sorry for the stupid question, but what is infinite complexity in an analog circuit? Any examples/models?


I think it's an ad-hoc term. But basically, MC simulations are needed because circuit elements such as transistors and resistors may be mismatched, for example in Vth between MOS transistors. This can create input offsets in op-amps, or timing differences in logic. You can run exactly the same simulations as usual (DC operating point, AC transfer function, time-domain with many thousands of points), just over and over with new random parameters drawn from the distributions estimated during device characterization (I think mostly Gaussian). Transistor models like BSIM are crazy complicated these days, and there's no way to find an analytical solution for all that.
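
Stripped of all the device-model detail, the loop is just this (a toy sketch, not a real SPICE flow; the mismatch sigma here is made up):

    import random
    import statistics

    # Toy Monte Carlo mismatch run: draw a threshold voltage per transistor
    # in a differential pair from the characterized Gaussian, and treat the
    # pair's Vth difference as the input-referred offset (to first order).
    VTH_NOMINAL_V = 0.40
    VTH_SIGMA_V = 0.005  # hypothetical mismatch sigma from characterization

    def run_once() -> float:
        vth1 = random.gauss(VTH_NOMINAL_V, VTH_SIGMA_V)
        vth2 = random.gauss(VTH_NOMINAL_V, VTH_SIGMA_V)
        return vth1 - vth2  # input offset, first-order approximation

    offsets_mv = [1e3 * run_once() for _ in range(10_000)]
    print("offset sigma:", statistics.stdev(offsets_mv), "mV")  # ~ sigma * sqrt(2)

A real flow reruns the full simulation suite per sample, but the structure is the same: resample parameters, resimulate, collect the distribution of the output.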



