Hardware issues in the past week
- new workstation parts arrive
- swap Titan from current home to new workstation
- now the Titan won’t work. Why?
- because Ubuntu updated while we slept, breaking driver support
- fight to re-configure new workstation
- replacement for the faulty card arrives
- appears to be faulty itself
- swap it for the card in my workstation for testing
- yeah, definitely faulty
- oh shit that’s really hot
- did the fan even come on?
- [zztzt] Ow! (Being burned, as a theorist, is pretty novel)
- this card is cursed
- Let’s see if we can get the Titan working
- nvidia releases new drivers
- I set myself on fire
Where running a different version of the OS than what was installed (two days ago), breaking my tenuous driver support, halting my work in its tracks, and making my adviser come in to work on Father’s Day…
… Is a feature, not a bug.
The Last Of Us
I’ve changed my mind; I don’t want the zombie apocalypse to happen.
I suppose it’s time to start packing.
It’s been so long since I’ve had legit liquor that my drink triggered my gag reflex.
Q: Is there any aspect of describing nonlinear dynamic systems that quantum computers would likely be helpful for?
I’m not an expert in quantum computing, so I encourage anyone who is to chime in.
My expectation for quantum computing is that since quantum computers are so difficult to build and sustain, algorithmic development for the ‘architecture’ is almost nonexistent.
The main idea of a quantum computer is that it can explore every answer to a question simultaneously. In this sense, quantum computers are really only good for one thing: simulating quantum systems.
Since dynamical systems are canonically nonlinear, and small scale (as one might feasibly simulate) quantum dynamics are effectively linear field theories, I can’t say I’m aware of many applications. However, it’s such a radically different computing model that many problems that don’t map well to current methods will find a home in quantum computing.
What computing models would work well for dynamical systems? GPUs, obviously. Simulating small systems of ODEs over large ensembles of parameters or initial conditions is a perfect application for GPUs, and simulation of large extended problems (PDEs) is none-too-shabby either.
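To make the GPU point concrete, here is a minimal sketch of the access pattern I mean: integrating thousands of independent copies of one small nonlinear ODE at once, one copy per parameter value. The system (a damped Duffing oscillator), the parameter grid, and the fixed-step RK4 integrator are all illustrative choices of mine, not from the question; NumPy stands in for a GPU array library here, since the same vectorized code maps directly onto something like CuPy or raw CUDA.

```python
import numpy as np


def simulate_duffing(gamma_grid, omega_grid, t_end=50.0, dt=0.01):
    """Integrate x'' + 0.1 x' + x^3 = gamma*cos(omega*t) for every
    (gamma, omega) pair simultaneously, using fixed-step RK4.

    Each array element is one independent trajectory; every arithmetic
    op below acts on all trajectories at once (the SIMD pattern GPUs love).
    """
    gamma, omega = np.meshgrid(gamma_grid, omega_grid, indexing="ij")
    x = np.zeros_like(gamma)  # position, one entry per parameter pair
    v = np.zeros_like(gamma)  # velocity

    def f(t, x, v):
        # Right-hand side of the first-order system (x', v').
        return v, gamma * np.cos(omega * t) - 0.1 * v - x**3

    t = 0.0
    for _ in range(int(t_end / dt)):
        k1x, k1v = f(t, x, v)
        k2x, k2v = f(t + dt / 2, x + dt / 2 * k1x, v + dt / 2 * k1v)
        k3x, k3v = f(t + dt / 2, x + dt / 2 * k2x, v + dt / 2 * k2v)
        k4x, k4v = f(t + dt, x + dt * k3x, v + dt * k3v)
        x = x + dt / 6 * (k1x + 2 * k2x + 2 * k3x + k4x)
        v = v + dt / 6 * (k1v + 2 * k2v + 2 * k3v + k4v)
        t += dt
    return x, v


# A 64x64 parameter sweep: 4096 trajectories advanced in lockstep.
final_x, final_v = simulate_duffing(np.linspace(0.1, 1.5, 64),
                                    np.linspace(0.5, 2.0, 64))
```

The point is the shape of the computation, not the particular oscillator: no trajectory ever talks to another, so the problem is embarrassingly parallel, and the per-step work is pure elementwise arithmetic.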
Guys, this might be just a tiny bit premature but
I MIGHT BE GETTING OUT OF THIS SHIT HOLE A MONTH EARLY I AM SO EXCITED I MIGHT SCREAM