
File Sizes and Bandwidth

Trappedinwikipedia
File Sizes and Bandwidth
I'm currently running a campaign where the players are spending a great deal of time beyond the pandora gates, which means that I've actually got to think about bandwidth limits and file sizes. Primarily I'm trying to figure out how many blueprints can be sent through a QE comm, and what the appropriate pricing for using them might be, as the party has wanted to use some 3rd-party ones in the past.

So far I've got a starting idea for how much bandwidth a large QE reservoir might have in total. It should be somewhere between 12 TB (modern internet video) and 10 PB (uncompressed 16K video at really high FPS, which is probably the largest video format in common use), depending on how good the image quality they're talking about is. This is simply to get a lower and upper bound, and is for 100 hours of each. Assuming some kind of middle ground, like compressed 8K video, is pretty much standard by EP norms, then 100 hours should be around 288 TB based on [url=http://en.wikipedia.org/wiki/High_Efficiency_Video_Coding#Tiers_and_leve...]the HEVC tiers and levels specs[/url]. This seems like a decent middle ground, both for usability and to preserve a certain amount of game balance. Assuming an ego file is around 1 petabyte, which is supposed to be fairly reasonable according to people who know more about that kind of thing than I do, it'd take around 100k credits worth of qubits to send an ego, assuming all the infrastructure is already owned and in place.

Where I'm less sure of things is how large blueprint files should be. I'm partially wondering from a simulationist point of view, but I'm also curious about what people think would be a good number for game flow. I'm currently thinking that X TB per cost category would be a good generic rule, with exceptions for very complicated things, such as morphs, and for very simple things which just need shape instructions, such as single-material goods like cheap knives.
I'm thinking X should be between 50 and 100 right now, which would either place [high] or [expensive] cost blueprints above the capacity of a large QE reservoir. Simple and complex blueprints go up or down one category. I don't know if this is realistic, but I'm not terribly concerned with that. Does anyone have any solutions they used to arbitrate when bandwidth was a limitation? Does anyone have an idea of how large blueprint files would probably be? How much should blueprint call-ins be something that gatecrashing teams have access to? Thanks for any help.
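To keep the estimates above consistent at the table, here's a minimal Python sketch of the arithmetic. The 800 Mbit/s figure is the real HEVC Level 6.2 high-tier maximum bitrate; the 1 PB ego file and the flat credits-per-TB rate are this thread's assumptions, not canon EP numbers.

```python
# Back-of-envelope QE bandwidth math for the estimates above.
# Capacity and pricing figures are this thread's assumptions, not canon.

TB = 1e12  # bytes

def video_size_bytes(bitrate_bps: float, hours: float) -> float:
    """Total size of a stream at a constant bitrate (bits/s) over `hours`."""
    return bitrate_bps / 8 * hours * 3600

# HEVC Level 6.2 high tier tops out at 800 Mbit/s; 100 hours of that is
# the "compressed 8K middle ground". Note 800 Mbit/s * 100 h is 288
# terabits, i.e. 36 TB.
hevc_8k = video_size_bytes(800e6, 100)
print(f"100 h of 8K HEVC ~ {hevc_8k / TB:.0f} TB")  # prints "36 TB"

# Cost to push an ego through, assuming a 1 PB ego file and a flat
# credits-per-TB rate backed out of the 100k-credit figure above.
EGO_TB = 1000           # 1 PB, thread's assumption
CREDITS_PER_TB = 100    # hypothetical implied rate
print(f"ego transfer ~ {EGO_TB * CREDITS_PER_TB:,} credits")  # 100,000
```

Swap in whatever bitrate and credits-per-TB rate fits your table; the function is just constant-bitrate arithmetic.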
Lazarus
I wouldn't worry too much
I wouldn't worry too much about bandwidth. The bigger issue is probably the maximum amount of data that can be sent before the group needs another reservoir, and that's going to depend an awful lot on how big those blueprints are.

You can figure that the size of a file is roughly related to the size of the object it produces, but this won't be entirely true. Something that can be produced at a 'lower resolution', like a bowling ball, wouldn't take nearly as large a file as a Spare, even though both objects are roughly the same size. I would leave it as GM fiat rather than trying to establish a lot of rules: establishing a lot of rules is a lot of work, and you'll probably find edge cases that break those rules and require GM fiat anyway.

The closest I would come to giving a ruling is that your average blueprint uses 'a lot'. There will undoubtedly be exceptions (such as the bowling ball), but in most cases you are talking about an awful lot of data so that a nanofabricator can produce a physical working object: not just whether an individual tiny 'voxel' is filled, but what it is filled with, whether it is bonded to each of the six 'voxels' around it, etc. There's a reason why your average gatecrashing team doesn't use QE communications to get blueprints. Does that mean your team can't? Of course not, but they're going to pay for it.
My artificially intelligent spaceship is psychic. Your argument is invalid.
ShadowDragon8685
When it comes to file sizes,
When it comes to file sizes, I suggest you follow the example of most sci-fi: either avoid mentioning file sizes and storage/memory capacities, or use made-up units. This prevents your stated values from looking hilariously stupid when and if real technology catches up, or if it conspicuously fails to do so. (Pretty much any time NCIS deals with the specifics of computers is an example of this.)

I would also suggest against basing the bandwidth costs of blueprints on the cost category of the object. There are too many expensive items which are small, and too many cheap ones that are gigantic.
Skype and AIM names: Exactly the same as my forum name. [url=http://tinyurl.com/mfcapss]My EP Character Questionnaire[/url] [url=http://tinyurl.com/lbpsb93]Thread for my Questionnaire[/url] [url=http://tinyurl.com/obu5adp]The Five Orange Pips[/url]
ORCACommander
quite frankly BP file sizes
Quite frankly, BP file sizes are going to be negligible. Think of the size of modern AutoCAD and 3ds Max files and compiled firmware; source code is a bit more bloated.
SquireNed
ORCACommander wrote:quite
ORCACommander wrote:
Quite frankly, BP file sizes are going to be negligible. Think of the size of modern AutoCAD and 3ds Max files and compiled firmware; source code is a bit more bloated.
I'm not so sure they'd be "negligible"; most of the things I've seen CAD files for are simple things, but blueprints in EP include a lot more than they do today: you have all the materials data, and you include higher-resolution parts. Of course, bandwidth and storage in EP are so massively advanced that even blueprints don't take up all that much time.

Filesize is more dependent on what you need to tell the fabber than on the object's physical size. Each material, piece of detail, and component adds some data, and you will be shipping firmware with almost anything (sans maybe old-school furniture or housewares, or rajput weaponry). Making a very complex object out of steel may result in a tiny file, especially if you can send it in a vector format or fractal pattern, but making a simple object out of a hundred different materials that aren't standardized could be nigh impossible.
Lazarus
ORCACommander wrote:quite
ORCACommander wrote:
Quite frankly, BP file sizes are going to be negligible. Think of the size of modern AutoCAD and 3ds Max files and compiled firmware; source code is a bit more bloated.
Modern CAD files to build an entire automobile are considerably larger than what you are thinking. The 3ds Max file for a car isn't all that large, but that's because it is relatively low resolution. If you 3D print a lot of those smaller files you will find that the output is fairly faceted (this occurs because the rendering engine often uses various shading models to produce an image that renders nicely, but such techniques cannot necessarily be applied to actual manufacturing). Additionally, those models are usually missing the vast majority of their components. A hood is relatively cheap, memory-wise; all those engine components are going to drive it way up.

Every single nut, bolt, and spring, every wire, every latch, every bit of polyfill in the seats, and every bit of texture on the dash needs to be defined in the blueprints. Not only that, but you have to specify all of those bits with engineering tolerances. You can't just say '30mm long #8 screw'; you have to specify how tight that screw is turned. Now yes, some of this can be handled in the file format so you don't have to declare all the data for every single screw every single time: you can say 'this is what I mean by a #8 screw' and then reference those settings every time that screw is used.

That's why something like a bowling ball might not take that big a file (here's the data for the various materials, this volume is filled with this material), but for anything with a reasonable amount of complexity I think you'll be looking at a bigger file than you expect.
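The per-voxel accounting above can be roughed out numerically. This is a hypothetical sketch: the 2-bytes-per-voxel figure just covers a material index plus six bond bits, the resolutions are illustrative, and a real format would compress uniform regions enormously (which is exactly why the bowling ball is cheap).

```python
# Rough size of a naive voxel-level blueprint: per voxel, a material ID
# plus bond flags to its six neighbours. All figures are illustrative.

def voxel_blueprint_bytes(volume_cm3: float, voxel_um: float,
                          bytes_per_voxel: int = 2) -> float:
    """Uncompressed size: 2 bytes/voxel ~ material index + six bond bits."""
    voxels_per_cm = 1e4 / voxel_um  # 1 cm = 10,000 micrometres
    return volume_cm3 * voxels_per_cm**3 * bytes_per_voxel

# A ~5,500 cm^3 bowling ball described at a coarse 100 um resolution:
print(voxel_blueprint_bytes(5500, 100) / 1e9, "GB")   # 11.0 GB raw
# The same ball at 1 um resolution, before any compression:
print(voxel_blueprint_bytes(5500, 1) / 1e15, "PB")    # 11.0 PB raw
```

The raw numbers blow up with the cube of the resolution, which matches the intuition here: complexity, not object size, is what drives the file size once compression flattens the uniform volumes.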
My artificially intelligent spaceship is psychic. Your argument is invalid.
R.O.S.S.-128
Pure Fiat
Unfortunately, the answer is entirely "it depends", which means all you can really do is make up whatever's convenient. It would depend not only on how complex the object you're making is, but also on how complex the process you're using to make it is, and on what level the instructions operate at.

For example, a blueprint could describe a precise arrangement of atoms (or voxels) in three-dimensional space with precise instructions on how the nanobots should place each one, or it could instead simply describe a set of 3D objects with simple material labels, followed by "insert tab A into slot A" assembly instructions for the fabber's on-board firmware (or you) to interpret. The latter would be a much smaller file than the former, because it offloads much of the burden of detail onto the fabber (similar to how the recipe for a cake is much shorter than its chemical analysis). Basically, a blueprint could be as big as an electron-resolution voxel model, or as small as a text file that defines a set of primitives, dimensions, and materials.

Different formats will likely also be more suitable for different types of objects. The text file, for example, might only be practical for simple objects that can be made from a few primitives with few materials, such as a bowling ball or traffic cone. Conveniently, Eclipse Phase doesn't simulate the fact that different fabbers with different tools/capabilities (or even build dimensions) might require different instructions. Best not to open that can of worms.

So, what does it *really* depend on in game terms? How much you as a GM want to squeeze their reliance on QE comms.
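To make the two extremes concrete, here's a hypothetical sketch comparing a primitives "recipe" against a raw voxel dump of the same object. Every field name, dimension, and resolution is made up for illustration; this is not an EP file format.

```python
# Same traffic cone, described two ways. All names/sizes are illustrative.
import json

# Recipe form: a few primitives, dimensions, and material labels, plus
# assembly hints for the fabber's firmware to interpret.
cone_recipe = {
    "primitives": [
        {"shape": "cone", "height_cm": 70, "base_radius_cm": 18,
         "material": "pvc_orange"},
        {"shape": "box", "side_cm": 36, "thickness_cm": 2,
         "material": "pvc_orange"},
    ],
    "assembly": ["fuse cone base to box centre"],
}
recipe_bytes = len(json.dumps(cone_recipe).encode())
print(f"recipe: {recipe_bytes} bytes")  # a few hundred bytes

# Voxel form: ~9,000 cm^3 of cone at 10 um resolution, 2 bytes per voxel.
voxel_bytes = 9000 * (1e4 / 10) ** 3 * 2
print(f"voxel dump: {voxel_bytes / 1e12:.0f} TB")  # prints "18 TB"
```

Roughly ten orders of magnitude between the two descriptions of the same object, which is why "pure fiat" is a defensible ruling: the format, not the object, decides the file size.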
End of line.
Trappedinwikipedia
I'd also expect blueprints to
I'd also expect blueprints to come with the firmware or software needed to run whatever they are, and in a setting where yottabyte files are only considered large, I'd expect code bloat to be a big thing.

I think what I'm leaning towards is "generic" blueprints being really large (those many-TB monsters) and interpretable by any unlocked CM, while most manufacturers will provide a set of terms for their own gear packages, which could get as restrictive as being limited to specific-brand CMs with their own internal machining. That should give the players some leeway for calling in new blueprints as situations arise, while avoiding the bookkeeping mess that is tracking individual blueprints.

There'll probably be a few major open-source materials libraries, probably Argonaut-controlled, which should cover the basics pretty well, though finding exactly what you want can be tricky. For more specialized, rugged, or tolerant gear I'll probably work up a shortlist of manufacturers, probably split across a few state-mandated library standards from Titan and a more eclectic set of standards from LLA, PC, MC, and Extropian hypercorps. Blueprints interfacing with a library would be a lot smaller, probably small enough that I won't keep track of them. I'll probably have QE comm capacity reduced by whole sets of gear, rather than individual pieces.

They're a fairly public "adventuring" crasher team, so I'm partially using this for the possibility of sponsors using them as a way to advertise gear through crasher endorsements and use.
MrWigggles
Well, what about Fabber
Well, what about fabbers having stock libraries of common materials and parts? Not everything in EP is custom; there are assumptions they can work against.