
QE reservoir capacity

Arenamontanus
QE reservoir capacity
The question came up in my game, so here is my estimate of QE reservoir capacities. A Low-Capacity Qubit Reservoir can send 10 hours of HD video, while a High-Capacity Qubit Reservoir can manage 100 hours.

HD video requires a few megabits per second when well compressed; if we assume 8 megabits per second as the bit rate, then a low-capacity container holds 36 gigabytes of qubits and a high-capacity one holds 360 gigabytes. If we assume the video is more heavily compressed, to one megabit per second, then the capacities are 4.5 gigs and 45 gigs respectively. Audio signals typically have bitrates between 8 kbit/s (telephone quality) and 256 kbit/s (DAB). 100 hours of voice gives 360 megabytes to 11 gigs for the low-capacity reservoir, and ten times that (3.6-110 gigs) for the high-capacity one. So a rough estimate is that the low-capacity reservoir holds maybe a few tens of gigabytes, and the high-capacity one a few hundred gigabytes.

Regarding egocasting, my own estimate is that an ego is on the order of 100 terabytes. So if you get a thousand of the Expensive high-capacity reservoirs (at a price tag of 20 million or so) you can do an interstellar egocast.
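For anyone who wants to fiddle with the numbers, here is the arithmetic as a few lines of Python. The durations are the figures quoted above; the bitrates, and the implied 1,000 hours of voice for the high-capacity reservoir, are just the assumptions stated here, nothing canon.
[code]
# Quick check of the arithmetic above. Durations are the figures quoted;
# the bitrates (and 1,000 h of voice for the high-capacity unit) are the
# assumptions stated in the post, nothing official.
def stream_bytes(hours, bits_per_second):
    """Bytes needed to hold a stream of the given duration and bitrate."""
    return hours * 3600 * bits_per_second / 8

GB = 1e9
for label, video_h, voice_h in [("low-capacity", 10, 100),
                                ("high-capacity", 100, 1000)]:
    video_hi = stream_bytes(video_h, 8e6) / GB    # 8 Mbit/s HD video
    video_lo = stream_bytes(video_h, 1e6) / GB    # 1 Mbit/s HD video
    voice_lo = stream_bytes(voice_h, 8e3) / GB    # 8 kbit/s telephone audio
    voice_hi = stream_bytes(voice_h, 256e3) / GB  # 256 kbit/s DAB audio
    print(f"{label}: video {video_lo:.1f}-{video_hi:.0f} GB, "
          f"voice {voice_lo:.2f}-{voice_hi:.1f} GB")
# low-capacity: video 4.5-36 GB, voice 0.36-11.5 GB
# high-capacity: video 45.0-360 GB, voice 3.60-115.2 GB
[/code]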
Extropian
King Shere
Re: QE reservoir capacity
Pardon a non-academic for nitpicking, but isn't 100 terabytes the entire brain's data storage, and thus much more data than its "mere" ego? I suggest EP technology can acquire the ego alone and doesn't need to transfer the entire brain it resided in. Perhaps large chunks of brain regions are unnecessary and can be replaced without loss by a "factory standard" in a recipient brain.
Tyrnis
Re: QE reservoir capacity
It all depends on who you ask/what sources you read. Less than 10 TB seems to be one common estimate, but is it really a practical one? http://p9.hostingprod.com/@modha.org/blog/2007/11/faq_anatomy_of_a_corti... -- they run/ran a rat-scale cortical model in 8 TB of memory with a whole lot of processing power. He extrapolates that out to humans with an estimate of needing a memory capacity of 3.2 PB and an even more ridiculous amount of processing power. Now granted, that's memory capacity and processing power needed to run the 'ego', not just storage of memories.
Smokeskin
Re: QE reservoir capacity
Arenamontanus wrote:
Regarding egocasting, my own estimate is that an ego is on the order of 100 terabytes. So if you get a thousand of the Expensive high-capacity reservoirs (at a price tag of 20 million or so) you can do an interstellar egocast.
The bottom line is, you can do instant egocasting; it's just data. And the amount needed is probably plot-level expensive however you estimate the ego's size, so any estimate should do. I assume it is for "getting home" from a suicide gatecrashing mission? Will the gate be compromised, or the gate ops so pissed off they won't let you back in? Otherwise, some sort of local backup receiver that could return through the gate would be much, much cheaper.
Arenamontanus
Re: QE reservoir capacity
King Shere wrote:
Pardon a non-academic for nitpicking, but isn't 100 terabytes the entire brain's data storage, and thus much more data than its "mere" ego?
So what is your estimate, and how do you arrive at it? I reviewed various estimates in appendix A and made my own estimates on page 80 of this report (pdf).
Quote:
I suggest EP technology can acquire the ego alone and doesn't need to transfer the entire brain it resided in. Perhaps large chunks of brain regions are unnecessary and can be replaced without loss by a "factory standard" in a recipient brain.
Sure, egos are compressed representations of a brain, but you still need to estimate what level of resolution the map is on. Leaving out 90% of the brain only buys you one order of magnitude. Compare to movies: if I told you that I had a general movie compression scheme that would produce a watchable 3-hour HD movie that only takes up one megabyte, would you believe it? Even if I told you that it didn't so much compress the original images as figure out how to reconstruct them, leaving plot and general appearance correct while fudging details?
Extropian
King Shere
Re: QE reservoir capacity
Arenamontanus wrote:
So what is your estimate, and how do you arrive at it? I reviewed various estimates in appendix A and made my own estimates on page 80 of this report (pdf).
First, I got the impression that your estimate was intended for a whole brain. I have followed your report earlier, and I do read topics concerning uploading. I'm fairly familiar with the magnitudes of bytes, and compared them to human neurons: 100 terabytes is 10^12 bytes, and human neurons are estimated at one hundred billion = 10^11 neurons. A byte (256 values) contains way less than a single neuron, so adding an extra power of magnitude accounts for that. That's my rough estimate for how 100 terabytes stores information equivalent to 10^11 neurons. Hans Moravec has also made an estimate on this topic.
Quote:
Hans Moravec, principal research scientist at the Robotics Institute of Carnegie Mellon University, estimates the human brain's processing power to be around 100 teraflops and its memory capacity to be around 100 terabytes. full brain simulation.
My estimate for the size of an ego? I can't do any better, but I still feel it's smaller than the entire data storage of the brain. I attempted to acquire the figure instead by using voodoo on the core book, page 269, where it says that at a regular clinic it takes 10 minutes to upload (5 minutes for a pod). But a 100 terabyte transfer has been done IRL, so it's doable for EP tech given 5-10 minutes. As for the movie example: yes, I would believe you. It's doable. Four regular books can be stored within one megabyte, so the "compressed" file could contain a movie production script and instructions for an HD movie production company.
Dry Observer
Re: QE reservoir capacity
Ah, but you are all forgetting... I have an incredibly shallow personality. Really, there's not that much to transfer, certainly nothing you'd miss. One low-capacity qubit reservoir and, rest-assured, my vacuity can pursue you to the ends of the universe, one coffee-pot or toaster-oven's subprocessor at a time... =) Which, of course, leads to the Hypercorps' latest black project, agents with stripped-down minds and utterly absent emotions, who can be transferred anywhere, and who only need to store the data for their video recordings and text reports. So when you call my Men In Black interchangeable, soulless automatons -- not quite. But we're working on it. Thanks for the encouragement, though. Ah, progress... =)

-

root
Re: QE reservoir capacity
root@QE reservoir capacity [hr]
Arenamontanus wrote:
So what is your estimate, and how do you arrive at it? I reviewed various estimates in appendix A and made my own estimates on page 80 of this report (pdf).
King Shere wrote:
Quote:
Hans Moravec, principal research scientist at the Robotics Institute of Carnegie Mellon University, estimates the human brain's processing power to be around 100 teraflops and its memory capacity to be around 100 terabytes. full brain simulation.
Hee, bring on the reference hammer! Earn your r-rep! Fight! I'm going to go make popcorn.
[ @-rep +1 | c-rep +1 | g-rep +1 | r-rep +1 ]
babayaga
Re: QE reservoir capacity
King Shere wrote:
First, I got the impression that your estimate was intended for a whole brain. I have followed your report earlier, and I do read topics concerning uploading. I'm fairly familiar with the magnitudes of bytes, and compared them to human neurons: 100 terabytes is 10^12 bytes, and human neurons are estimated at one hundred billion = 10^11 neurons. A byte (256 values) contains way less than a single neuron, so adding an extra power of magnitude accounts for that. That's my rough estimate for how 100 terabytes stores information equivalent to 10^11 neurons.
100 terabytes is 10^14 bytes. The human brain has on the order of 10^11 neurons. But the catch is that -- as far as we understand it -- the "brain map" is given by the *connections* between the neurons, the synapses, which number on the order of 10^15 (in small kids; closer to 10^14 in adults), so a few thousand per neuron. The key values you have to record are the strength of each synapse of each neuron, for which a byte is probably sufficient (and two or three certainly are), and the endpoint neuron, for which you'll need about 4-6 bytes. So you'll need about 10^15 to 10^16 bytes, assuming:
a) no additional information about other stuff (say, various chemicals altering the synapse dynamics); I'd say this is unlikely to add a full order of magnitude.
b) no compression/compressibility; this may drop the total by an order of magnitude (after all, the cortex is little more than one tenth of the total neurons in the human brain), but I'd say it's unlikely to drop it by more than two.
So, something between 10^13 and 10^17 bytes. If you have that, simulating a brain is easy: each neuron's level of excitation is updated -- let's say every millisecond or tenth of a millisecond -- based on the levels of excitation of the neurons sending a synapse to it and the strengths of the corresponding synapses.
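(The same bookkeeping as a few lines of Python, in case anyone wants to vary the assumptions; every figure in it is just the guess stated above, nothing measured.)
[code]
# Back-of-envelope version of the estimate above; all figures are the
# assumptions just stated, not measurements.
synapses = 1e15            # order of magnitude for a young human brain
bytes_per_strength = 1     # one byte is probably enough, two or three certainly
bytes_per_endpoint = 5     # 4-6 bytes to index ~1e11 possible target neurons

raw_map = synapses * (bytes_per_strength + bytes_per_endpoint)
print(f"raw connectome map: ~{raw_map:.0e} bytes")  # ~6e+15, i.e. 10^15-10^16

# Allow up to an order of magnitude more for extra biochemistry and one or
# two less for compressibility, and you land in the 10^13-10^17 byte range.
[/code]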
King Shere
Re: QE reservoir capacity
babayaga wrote:
100 terabytes is 10^14 bytes.
Oops. Yes, I suppose I failed to add the 100 to the (1 tera) 10^12.
Arenamontanus
Re: QE reservoir capacity
A lot of our estimates work by counting the number of some kind of thing in the brain, then multiplying that by an assumed information content. Usually people tend to think it is obvious or natural what that kind of thing is - synapses are the most common example. Myself, I think it is cellular compartments (synapses plus small sections of neurons). But we do not have a firm reason to settle on a particular resolution yet. It has to be individual (two egos are different from each other), so it has to be a finer resolution than the overall area-to-area connectivity of the brain. Among people I have asked, the rough view is somewhere on the synaptic/compartment level, but there are concerns that we might need to include some biochemistry. Some pessimists think we need a lot of biochemistry, making the number of simulated things enormously higher. And of course Hameroff et al. will claim we need to include quantum stuff (at which point scanning properly becomes nearly impossible). And it might be that the initially complex simulation can be simplified in a lot of ways once we have an upload running - maybe we just need 16 synaptic strength levels, or the cerebellum can be run as more efficient code than a neural network. Still, I am very suspicious of any estimate getting below the terabyte level. The bigger computational demand is running the emulation in real time: memory is easy compared to getting the TFLOPS for a brain, even at fairly low resolution.
Extropian
King Shere
Re: QE reservoir capacity
If it were available for the task, what level of detail could the current champion of these supercomputers manage? (See the top 500 supercomputers list.) The top one was, last time I checked, a 10.51 petaflop supercomputer.
root
Re: QE reservoir capacity
root@Brain bit count [hr] Something I've wondered about is that in addition to synaptic strength weights, transmission speed is an important factor. There is the rough breakdown between grey and white matter, depending on myelination, but there is also the raw length of the neuron and the number of layers any given signal needs to propagate through. So while you might be able to fit a neural-network representation of a brain into a few terabytes, the connection graph is significantly more complicated. And that's even ignoring the weirdness caused by parasitic capacitance and other electrical/magnetic oddities caused by interference. Meat computers, I feel, are intrinsically tied to their structure, not just their architecture.
[ @-rep +1 | c-rep +1 | g-rep +1 | r-rep +1 ]
Arenamontanus
Re: QE reservoir capacity
King Shere wrote:
If it were available for the task, what level of detail could the current champion of these supercomputers manage? (See the top 500 supercomputers list.) The top one was, last time I checked, a 10.51 petaflop supercomputer.
That is 10^16 flops, and by my old estimates that could run a human-size analog network population model - groups of neurons replaced with ANN nodes, 10^8 such minicolumn units with 10^13 connections. It would take 50 terabytes of memory storage, which is eminently doable today. I expect that we can run a spiking neural network version before the end of the decade. Here each neuron would be in the simulation, but they would all be treated as simple units (sum weighted inputs, fire if the sum is big enough) rather than having the full biological complexity. The big problem today is the lack of brain scanner devices with the right resolution and volume - we do not know the connectome well enough to run even the analog network model.

Root: yes, delays likely matter a bit. They are not too hard to model, essentially just an extra variable per connection. But there are plenty of other potential meat properties that we must investigate - from capacitances to electromagnetic interactions to modulation by the glia cells. The neat thing with biology is that it tends to be fairly robust: once you get close to what is needed, things tend to self-organize enough to actually work.
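To make the "simple units" idea concrete, here is a toy sketch in Python/numpy: sum weighted inputs, fire if the sum clears a threshold, with a per-connection delay as the extra variable mentioned for root's point. The network is minuscule and randomly wired - purely an illustration of the update rule, not a stand-in for the 10^8-unit model.
[code]
import numpy as np

# Tiny random network of threshold units with per-connection delays.
rng = np.random.default_rng(1)
n_units, fan_in, max_delay = 2000, 100, 5            # illustrative sizes only

sources = rng.integers(0, n_units, (n_units, fan_in))        # who feeds whom
weights = rng.normal(0.0, 0.3, (n_units, fan_in))            # connection strengths
delays = rng.integers(1, max_delay + 1, (n_units, fan_in))   # in time steps
threshold = 1.0

# Ring buffer of recent spike vectors, so delayed inputs can be looked up.
history = np.zeros((max_delay + 1, n_units), dtype=bool)
history[0] = rng.random(n_units) < 0.05              # seed a little activity

for t in range(1, 200):
    past = history[(t - delays) % (max_delay + 1), sources]  # delayed spikes
    drive = (weights * past).sum(axis=1)              # sum weighted inputs
    spikes = drive > threshold                        # fire if sum big enough
    history[t % (max_delay + 1)] = spikes

print("fraction of units firing at the end:", spikes.mean())
[/code]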
Extropian
Dry Observer
Re: QE reservoir capacity
One serious issue with a "successful uploading" which copies a fully functional mind and memories is that in getting something key *just*slightly* wrong, you run the risk of having a seriously deranged mind running on a platform capable of outpacing any conventional biological mind. Normally I don't worry as much about a flawed but fully sentient supercomputer pestering us -- I think the odds of getting things completely wrong and ending up with a mind that is effectively non-functional or dead are much higher than creating an unstable or aggressive sentient upload. Still, it's an issue.

-

root
Re: QE reservoir capacity
root@QE reservoir capacity [hr] Dry Observer, have you seen any videos or pictures of brain surgery? I wish I were kidding when I say that it looks a whole lot like taking ice cream scoops to a brain. They have to check the area first to make sure the patient doesn't lose words, but it's amazing how much of a brain can be lost with little or no side effects. The phenomenon of having a bit or two out of place and losing everything is related to how our digital computers handle data, and it is combated with parallelism and redundancy. Every neuron in a brain should be thought of as a complete, if crappy, computing device (technically you need at least two layers in an ANN before it's Turing complete, but whatever), where parallelism and redundancy are the name of the game. Now, there may be a very few structures that [i]must[/i] be constructed as exact replicas, but if that's the case then there are few enough of those structures that the extra overhead of sending a thousand copies of each is negligible.
[ @-rep +1 | c-rep +1 | g-rep +1 | r-rep +1 ]
Xagroth
Re: QE reservoir capacity
I think it takes about one hour to make a backup of the ego (or you just pop out the cortical stack...). Considering the speed of the stuff in EP (5 minutes to make a brainscan without much detail, just to check identity!), I'd say they take a lot of views of the patient's head, making the chance of error minute.