
What year was the Fall, do you reckon?

ShadowDragon8685
What year was the Fall, do you reckon?
I know the actual year isn't really stated officially, to avoid the "2001: A Space Odyssey" effect... But knowing the year is helpful, so as to know what calendar to use... And there are clues. For instance, it's stated that the oldest gerontocrats are nearing 200 years old, and in Gatecrashing, when they interview that ego that was pulled out of a habitat the TITANs left behind, the ego interviewed states her date of birth as being in the year 1983.

If we assume that someone who was born around 1980 (give or take a decade) started getting access to the first anagathics when they were in their 70s-90s, that would put the first availability of anagathics to the super-rich at around 2050-2070 or thereabouts, and that would be the "near onto one century" mark. Another 90 years or so gives us 2150, and people who can be meaningfully said to be nearing their two-century mark.

So, I like to use 2150 as the year of the Fall, and that gives us 2160 as 10 AF, the default time frame of the setting. That lets me say things like "The game is starting on April 2nd, 10 AF/2160 for those still using the old Gregorian calendar." Obviously, the PC is trying to push the Martian calendar, but I imagine few people give a damn and are willing to switch from the calendar everyone's already using. So at most, on Mars, the official shit uses the PC calendar, and everybody's muse automatically translates it into normal-people timekeeping.
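A quick back-of-the-envelope check of that arithmetic, in Python; the birth year, the 70-90 age window for the first anagathics, and the extra ~90 years are this post's assumptions, not anything stated in the books:

[code]
# Rough sanity check of the "Fall around 2150" estimate above.
# Assumptions (this post's, not canon): gerontocrats born ~1980,
# first anagathics available to the super-rich at ages ~70-90,
# oldest of them "nearing 200" by the time of the setting.

birth_year = 1980
first_anagathics = [birth_year + age for age in (70, 90)]
print("First anagathics available around:", first_anagathics)   # [2050, 2070]

nearly_200 = [year + 90 for year in first_anagathics]            # another ~90 years on
print("Oldest gerontocrats near 200 around:", nearly_200)        # [2140, 2160]

fall_year = 2150                      # splitting the difference
print("10 AF would then be:", fall_year + 10)                    # 2160
[/code]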
Skype and AIM names: Exactly the same as my forum name. [url=http://tinyurl.com/mfcapss]My EP Character Questionnaire[/url] [url=http://tinyurl.com/lbpsb93]Thread for my Questionnaire[/url] [url=http://tinyurl.com/obu5adp]The Five Orange Pips[/url]
Seekerofshadowlight
Well this subject has come up
Well, this subject has come up before. As you said, the person in Gatecrashing was born in 1988, and we know from Transhuman that the oldest a character can be is 130 years. If that is the case, and if you use, say, 85 as the oldest, that puts it around 2100, which is far too young for me. I, like you, put it later; for me it's 2180-ish.
Smokeskin
T
The whole timeline is bonkers, I wouldn't worry about it. That it takes over 60 years to go from computers exceeding human capabilities to self-improving AI is silly beyond comprehension.
Patrick Northedgers
Not really given
The date of The Fall is not given directly, but it has been calculated / approximated from existing sources (several birthdates and seasons of Uranus) by forum hivemind as 2133 CE. It is, of course, not confirmed by the authors, and most likely will never be, so this is an unofficial answer. Source: [url]http://www.eclipsephase.com/when-did-fall-occur[/url] and topic linked there. Of course, before using it openly in your game, remember this obfuscation (BF / AF calendar) was done on purpose to increase immersion and show the gravity of The Fall itself.
"Normal" does not exist anymore. I consider it a good symptom, though.
ShadowDragon8685
Patrick Northedgers wrote:The
Patrick Northedgers wrote:
The date of The Fall is not given directly, but it has been calculated / approximated from existing sources (several birthdates and seasons of Uranus) by forum hivemind as 2133 CE. It is, of course, not confirmed by the authors, and most likely will never be, so this is an unofficial answer. Source: [url]http://www.eclipsephase.com/when-did-fall-occur[/url] and topic linked there. Of course, before using it openly in your game, remember this obfuscation (BF / AF calendar) was done on purpose to increase immersion and show the gravity of The Fall itself.
I just want to know what calendar to use so I know what day of the week is when. Though I don't really mind the "10 AF" thing, I would also point out that the "AD/BC" modification to the Julian Calendar (which had been in use since 46 BC) didn't come around until 525 AD, and wasn't widely adopted until after 800. Really, people who have lived most/all of their lives knowing that it's the year 2XXX aren't going to suddenly say "This catastrophe was so bad we shall now define our calendar by it!" so much as they're going to say "21xx was the year Earth fell. 21xx was the year we lost almost everything. 21xx(+10) is the year we take back our civilization." or whatever.
Skype and AIM names: Exactly the same as my forum name. [url=http://tinyurl.com/mfcapss]My EP Character Questionnaire[/url] [url=http://tinyurl.com/lbpsb93]Thread for my Questionnaire[/url] [url=http://tinyurl.com/obu5adp]The Five Orange Pips[/url]
thezombiekat
Patrick Northedgers wrote:
Patrick Northedgers wrote:
Of course, before using it openly in your game, remember this obfuscation (BF / AF calendar) was done on purpose to increase immersion and show the gravity of The Fall itself.
This always bothered me a bit. Partially because, as others have pointed out, historically dating systems don't change rapidly without a strong authority to mandate the change (I believe some governments used either the crowning of the current monarch or the establishment of the dynasty as the origin point). EP lacks a single authority to define a new calendar. If the PC tried, they would probably use the Martian year and day, in accordance with the new home-world meme they are working so hard to spread. Many of the anarchists would refuse to use such a system just because it was a PC initiative.

The other problem is which day is the origin point. Depending on the definition you use, the Fall took between a week and a couple of months. There are several psychological breakpoints during the Fall you could use as the start date for the calendar: the destruction of the first space elevator, the first city nuked, the UN declaring the evacuation, the establishment of the blockade. But it would be at least 3 months after the Fall before any political entity with the influence to affect others would be thinking about anything as frivolous as changing the calendar. Everybody was swamped with refugees and struggling to live.

New calendars have traditionally had only two purposes: predicting the seasons to assist agriculture and trade (which the AF calendar fails to do, as it remains based on the Earth year), or political/ideological reinforcement. Given the political situation post-Fall, I would expect the only calendar everybody could agree on to be the one they had been using before.

All that aside, I accept the reason for the new calendar given in the book. They didn't want a couple of technological advances made or missed to invalidate the setting too quickly. It won't help, of course; where 2001 was overly optimistic about the future of space travel, I suspect EP is overly pessimistic. The ages and birth dates given do establish an approximate date, and we will exceed the described technology well in advance.
ShadowDragon8685
Hrm... I'm going to be
Hrm... I'm going to be running a game tomorrow... Today. And, well, getting my ducks lined up with my players, anyway, to see what kind of game they want. I think I'm going to say that there are three calendars in use.

The PC are pushing their Martian calendar; not just for the months, but also to designate year 0 as the establishment of the Offworld Consortium, the precursor to the PC as we know it today, trying to establish their founding as an all-important foundational event in human history. This isn't gaining much traction anywhere the PC can't enforce it by rule of law, especially because the Martian year is nearly twice as long and has a crapload more months than the traditional Julian/Gregorian calendar, which makes timekeeping even more complicated when interacting with others. ("The contract is for one year." "Whose year? Earth's year, or Mars's year?" "Why not Mercury's year?" "[b]Earth's[/b] year - one Earth year from this date.")

There will be the "10 AF crowd," Firewall included, who want to designate Year 0 as the year the Fall happened; so traumatized by the Fall of Earth that they want to define the whole of transhuman existence by it. They're gaining traction among those who have never seen Earth.

And then there's the "21xx" crowd, who are unimpressed by these silly attempts at historical revisionism, and counter the "10 AF crowd" by asking why the year 1945 wasn't redesignated as 0 AN (After Nuke) because that was the year the first nuclear bomb was detonated by humanity, or by citing other historical events. To them, the Fall happened in the year 21XX; we get over it and move on, but we don't play stupid and try to forget everything that happened before it, or redefine human history by it. The Jovians like this, of course - the religious ones because they still see the birth of Jesus Christ as being the definitional human moment, Fall be damned (quite literally), and the areligious ones because it lends them legitimacy as the evolution of pre-Fall national powers.

And of course, when dealing with the larger AF crowd, there's the question of which year they will use - the old Gregorian year with the 0 redesignated, or an entirely new year, possibly designated around the sun's own cycles, or the average of the orbital year of all the main planets in the solar system (Mercury through Uranus), and while we're at it, why don't we redesignate the day as well... Of course, nobody can agree on a [i]new[/i] calendar, so the old one sticks around by default.

I think I'll go with the 2133 number as the year of the Fall, making 10 AF 2143. That means that January 1st, 10 AF, is a Tuesday... It's as good a day as any to begin a year on, I guess. Muse internal clocks are probably still on the Gregorian calendar, though they can obviously do conversions to the user's preferences - and from calendars anyone else is trying to use.
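For anyone who wants to double-check that weekday (or try a different Fall year), Python's proleptic Gregorian calendar handles far-future dates fine; a minimal sketch, assuming the Fall is 2133 CE and 10 AF is therefore 2143 CE as above:

[code]
from datetime import date

# House-rule assumption from this post: the Fall is 2133 CE, so 10 AF = 2143 CE.
fall_ce = 2133
new_years_day = date(fall_ce + 10, 1, 1)
print(new_years_day.strftime("%A"))      # Tuesday

# Same check for the OP's scheme (Fall = 2150, so 10 AF = 2160):
print(date(2160, 4, 2).strftime("%A"))   # Wednesday, if "April 2nd, 10 AF" uses that epoch
[/code]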
Skype and AIM names: Exactly the same as my forum name. [url=http://tinyurl.com/mfcapss]My EP Character Questionnaire[/url] [url=http://tinyurl.com/lbpsb93]Thread for my Questionnaire[/url] [url=http://tinyurl.com/obu5adp]The Five Orange Pips[/url]
bibliophile20
Smokeskin wrote:The whole
Smokeskin wrote:
The whole timeline is bonkers, I wouldn't worry about it. That it takes over 60 years to go from computers exceeding human capabilities to self-improving AI is silly beyond comprehension.
I wouldn't be too sure. It was 60 years from first aircraft to first spaceflight, to give just one historical example.

"Democracy is two wolves and a lamb voting on what to have for lunch. Liberty is a well-armed lamb contesting the vote." -Benjamin Franklin

Smokeskin
bibliophile20 wrote:Smokeskin
bibliophile20 wrote:
Smokeskin wrote:
The whole timeline is bonkers, I wouldn't worry about it. That it takes over 60 years to go from computers exceeding human capabilities to self-improving AI is silly beyond comprehension.
I wouldn't be too sure. It was 60 years from first aircraft to first spaceflight, to give just one historical example.
Are you aware of Moore's Law? Once you hit the human capability point, they will quickly become much faster and much cheaper at solving any task than humans. And when the AI engineers and researchers themselves are thinking faster and faster because they themselves are AI's, their speed increase will plug directly into an increase in AI development speed. And on top you should factor in that they don't just get faster from better hardware, they also get faster and smarter from software improvements. It's the classic intelligence explosion idea - but even without, you're still looking at around a trillion times performance increase in that timeframe.
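For what it's worth, the "trillion times" figure is just straight exponential arithmetic; a rough sketch, assuming performance per dollar doubles every 18 months (the loose reading of Moore's Law used here):

[code]
# Rough check of the "trillion times" claim, assuming performance per
# dollar doubles every 18 months over the 60-year span in question.
years = 60
doubling_period = 1.5                     # years per doubling (assumed)

doublings = years / doubling_period       # 40 doublings
growth = 2 ** doublings                   # ~1.1e12

print(f"{doublings:.0f} doublings -> about {growth:.2e}x")   # ~1.10e+12, i.e. about a trillion
[/code]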
thezombiekat
well muses manufactured in PC
Well, muses manufactured in PC space will use the Martian calendar for internal processing, not that it matters. Give some consideration to Morningstar: they are effectively a splinter group from the PC and would probably have been transitioning from the 21xx to the Martian calendar prior to separation. They wouldn't want to adopt a calendar designed to funnel power to a different group, but will they create their own entry in the calendar wars, or support a politically neutral calendar and attack the PC over their self-serving manipulation of timekeeping systems?
Quincey Forder
Year 0
For my own version of the setting, I placed the Singularity that led to the Fall in 2158. The Fall itself took several months by my reckoning; one doesn't just walk in... I mean, evacuate Earth and wage a total war on mere days' notice. Then comes the aftermath: the trauma, the reconstruction of transhumanity, and the development of the Planetary Consortium. This couldn't have taken less than a decade, never mind a year! There are thousands of hypercorporations in the PC, with a select prominent few like Cognite, Experia, etc. at the top and in the main focus, and it takes years to build that infrastructure, stabilize it, and make it accepted and functional. And so, I placed the landmark of 1 AF in 2188, with the inauguration of the Planetary Consortium on that date. Big celebrations, parties, all around the Inner System. But that's the official, canon calendar, adopted by others for the sake of practicality and diplomacy. Some would keep Anno Domini as a local calendar, others would use the Chinese calendar, or the Hebrew one, or even a stardate calendar. But all would use the AF calendar for inter-habitat transactions and points of reference.
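Since every post in this thread picks a different epoch, a tiny converter keeps the house rules straight; a sketch assuming the convention used above (the Fall year counts as 0 AF), with the epochs being the ones proposed in this thread rather than anything canon:

[code]
def af_to_ce(af_year: int, ce_of_1_af: int) -> int:
    """Convert an AF year to a CE year, given which CE year counts as 1 AF."""
    return ce_of_1_af + (af_year - 1)

# House-rule epochs proposed in this thread (none of these are canon):
epochs = {
    "forum estimate (Fall = 2133, so 1 AF = 2134)": 2134,
    "ShadowDragon8685 (Fall = 2150, so 1 AF = 2151)": 2151,
    "Quincey Forder (1 AF = PC inauguration, 2188)": 2188,
}

for name, ce_of_1_af in epochs.items():
    print(f"{name}: 10 AF = {af_to_ce(10, ce_of_1_af)} CE")
[/code]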
[center] Q U I N C E Y ^_*_^ F O R D E R [/center] Remember The Cant! [img]http://tinyurl.com/h8azy78[/img] [img]http://i249.photobucket.com/albums/gg205/tachistarfire/theeye_fanzine_us...
jackgraham
1986.
1986.
J A C K   G R A H A M :: Hooray for Earth!   http://eclipsephase.com :: twitter @jackgraham @faketsr :: Google+Jack Graham
ShadowDragon8685
jackgraham wrote:1986.
jackgraham wrote:
1986.
The Fall of Earth, jack, not the Fall of Heavy Metal. :)
Skype and AIM names: Exactly the same as my forum name. [url=http://tinyurl.com/mfcapss]My EP Character Questionnaire[/url] [url=http://tinyurl.com/lbpsb93]Thread for my Questionnaire[/url] [url=http://tinyurl.com/obu5adp]The Five Orange Pips[/url]
Decivre
Smokeskin wrote:Are you aware
Smokeskin wrote:
Are you aware of Moore's Law? Once you hit the human capability point, they will quickly become much faster and much cheaper at solving any task than humans. And when the AI engineers and researchers themselves are thinking faster and faster because they themselves are AI's, their speed increase will plug directly into an increase in AI development speed. And on top you should factor in that they don't just get faster from better hardware, they also get faster and smarter from software improvements. It's the classic intelligence explosion idea - but even without, you're still looking at around a trillion times performance increase in that timeframe.
I think he assumed you thought it was bonkers because it wasn't enough time to go from one to the next, not the other way around. Personally, I see the long stretch of time between the two as being caused heavily by opponents to the idea of creating smarter-than-human AI. When you think of the Fall as a series of events that culminated in the year-long destruction of Earth, it probably really started way back then... when culture and rapidly advancing technology had their first hard ethical clashes, and the speed of progress was slowed. Also note that the 60-year period according to the timeline is in reference to the development of the TITANs. The timeline does not make note of when the Prometheans were first created... which I imagine was far earlier.
Transhumans will one day be the Luddites of the posthuman age. [url=http://bit.ly/2p3wk7c]Help me get my gaming fix, if you want.[/url]
NewtonPulsifer
Smokeskin wrote:bibliophile20
Smokeskin wrote:
bibliophile20 wrote:
Smokeskin wrote:
The whole timeline is bonkers, I wouldn't worry about it. That it takes over 60 years to go from computers exceeding human capabilities to self-improving AI is silly beyond comprehension.
I wouldn't be too sure. It was 60 years from first aircraft to first spaceflight, to give just one historical example.
Are you aware of Moore's Law? Once you hit the human capability point, they will quickly become much faster and much cheaper at solving any task than humans. And when the AI engineers and researchers themselves are thinking faster and faster because they themselves are AI's, their speed increase will plug directly into an increase in AI development speed. And on top you should factor in that they don't just get faster from better hardware, they also get faster and smarter from software improvements. It's the classic intelligence explosion idea - but even without, you're still looking at around a trillion times performance increase in that timeframe.
Moore's Law is dead on CMOS right now, though, with no replacement in sight. Source: Gordon E. Moore
"I fear all we have done is to awaken a sleeping giant and fill him with a terrible resolve."- Isoroku Yamamoto
Smokeskin
NewtonPulsifer wrote:
NewtonPulsifer wrote:
Moore's Law is dead on CMOS right now, though, with no replacement in sight. Source: Gordon E. Moore
I know that Moore's Law was something very specific back when he formulated it, about the number of transistors per square inch. I thought the meaning of "Moore's Law" these days had switched to something more along the lines of "processing power per dollar" - if that's an accepted layman's understanding or just plain wrong, I don't know. Anyway, that's what I meant: processing power per dollar grows exponentially. That's what matters in terms of social implications. I'd expect that the technologies will change over time. I also read Kurzweil's analysis on how that same growth rate extends way back into 20th-century mechanical calculators, and while I haven't checked the numbers, I haven't seen anyone refute it. If you have anything juicy on changes in Moore's Law, please link :) My most probable scenario of the future is based on it, and it reflects A LOT on how I've planned my life.
Lorsa
Personally I like the fact
Personally, I like the fact that there isn't any stated reference to our current-day calendar. In all the games that have done so, it usually ends up looking very weird, and the game usually doesn't age very well (like Shadowrun). The few events listed as having happened in some 20-year timespans before the Fall are quite enough for me.

It doesn't specify when the Prometheans were actually created. Like Decivre said, it probably was much earlier than the TITAN project. Actual processing power probably had nothing at all to do with slowing down the various super-smart AI developments; rather it was a software issue combined with corporate and government politics. I imagine a lot of AIs were limited in their interaction with the physical world and even destroyed.

The problem with Moore's law is that it is based on a technology that has reached its limit today. There's lots of research going on trying to find the next construction method and architecture. Also, there are some very real physical problems with the scales we are reaching now, such as electron tunneling messing up currents. Even if we do solve these issues, there is a limit to Moore's law. We can't decrease the size of a transistor forever, or we'll eventually end up in territory where all we have left are our theoretical strings.
Lorsa is a Forum moderator [color=red]Red text is for moderator stuff[/color]
NewtonPulsifer
Smokeskin wrote
Smokeskin wrote:
NewtonPulsifer wrote:
Moore's Law is dead on CMOS right now, though, with no replacement in sight. Source: Gordon E. Moore
I know that Moore's Law was something very specific back when he formulated it, about the number of transistors per square inch. I thought the meaning of "Moore's Law" these days had switched to something more along the lines of "processing power per dollar" - if that's an accepted layman's understanding or just plain wrong, I don't know. Anyway, that's what I meant: processing power per dollar grows exponentially. That's what matters in terms of social implications. I'd expect that the technologies will change over time. I also read Kurzweil's analysis on how that same growth rate extends way back into 20th-century mechanical calculators, and while I haven't checked the numbers, I haven't seen anyone refute it. If you have anything juicy on changes in Moore's Law, please link :) My most probable scenario of the future is based on it, and it reflects A LOT on how I've planned my life.
Well, a caveat - I work in the semiconductor industry (software side - that is, tools that companies like Intel use to design their chips). If you read Moore's original paper, there's more than just "2x transistor density every 2 years" and "2x performance every 1.5 years":

1. cost
2. heat
3. number of transistors
4. distance between transistors

http://dujs.dartmouth.edu/spring-2013-15th-anniversary-edition/keeping-u...

We've already had companies like Nvidia skipping die shrinks, which has *never* happened before in the history of the industry (because it wasn't worth it - and Intel has delayed their 14nm Broadwell due to defect density), and we already started losing the 2x performance every 1.5 years as CMOS hit a clock speed wall (about 8 years ago). And heat is not going down in a geometric fashion. Heat becomes a showstopper at 5 nanometers or so for CMOS. Gordon Moore has predicted CMOS will hit a speed bump around 2013-2018, and a wall some time after that. So far I'm seeing he's 100% correct.

In 2005, Gordon Moore stated that the law “can’t continue forever. The nature of exponentials is that you push them out and eventually disaster happens”*

P.S. Kurzweil's predictions are worthless. The guy doesn't even have a tractable model of the brain. His 2020 prediction of a desktop computer as powerful as a human brain is 100% not going to happen.

*M. Dubash, Moore’s Law is dead, says Gordon Moore (13 April 2005). Available at http://news.techworld.com/operating-systems/3477/moores-law-is-dead-says...
"I fear all we have done is to awaken a sleeping giant and fill him with a terrible resolve."- Isoroku Yamamoto
Smokeskin
NewtonPulsifer wrote
NewtonPulsifer wrote:
Smokeskin wrote:
NewtonPulsifer wrote:
Moore's Law is dead on CMOS right now, though, with no replacement in sight. Source: Gordon E. Moore
I know that Moore's Law was something very specific back when he formulated it, about the number of transistors per square inch. I thought the meaning of "Moore's Law" these days had switched to something more along the lines of "processing power per dollar" - if that's an accepted layman's understanding or just plain wrong, I don't know. Anyway, that's what I meant: processing power per dollar grows exponentially. That's what matters in terms of social implications. I'd expect that the technologies will change over time. I also read Kurzweil's analysis on how that same growth rate extends way back into 20th-century mechanical calculators, and while I haven't checked the numbers, I haven't seen anyone refute it. If you have anything juicy on changes in Moore's Law, please link :) My most probable scenario of the future is based on it, and it reflects A LOT on how I've planned my life.
Well, a caveat - I work in the semiconductor industry (software side - that is, tools that companies like Intel use to design their chips). If you read Moore's original paper, there's more than just "2x transistor density every 2 years" and "2x performance every 1.5 years":

1. cost
2. heat
3. number of transistors
4. distance between transistors

http://dujs.dartmouth.edu/spring-2013-15th-anniversary-edition/keeping-u...

We've already had companies like Nvidia skipping die shrinks, which has *never* happened before in the history of the industry (because it wasn't worth it - and Intel has delayed their 14nm Broadwell due to defect density), and we already started losing the 2x performance every 1.5 years as CMOS hit a clock speed wall (about 8 years ago). And heat is not going down in a geometric fashion. Heat becomes a showstopper at 5 nanometers or so for CMOS. Gordon Moore has predicted CMOS will hit a speed bump around 2013-2018, and a wall some time after that. So far I'm seeing he's 100% correct.
Kurzweil says the same. But he believes that other technologies will take over and keep the exponential growth in performance going. What do you think of Kurzweil's analysis of machine performance from 1890 onward? According to him, the trend of exponential growth in performance has held since then through five different technology regimes. I'd link to accelerating change on Wikipedia, but I fear my iPad will delete my text if I change tabs :)
Quote:
In 2005, Gordon Moore stated that the law “can’t continue forever. The nature of exponentials is that you push them out and eventually disaster happens”*
Yeah, Kurzweil also believes that. However, he believes that performance growth won't slow before computers are a trillion times faster or more.
Quote:
P.S. Kurzweil's predictions are worthless. The guy doesn't even have a tractable model of the brain. His 2020 prediction of a desktop computer as powerful as a human brain is 100% not going to happen.
To be fair to Kurzweil, he predicts that human-level intelligence won't be possible until the late 2020s. The early 2020s "as powerful as the human brain" comes after you factor in that (according to him), for the computer models of neural tissue made so far, programmers have afterwards been able to make more efficient implementations that do the same processing roughly 1000 times faster. Basically he's saying that the brain is probably wasting 99.9% of its calculations compared to what human programmers could manage after some years of tweaking and tinkering.
Quote:
*M. Dubash, Moore’s Law is dead, says Gordon Moore (13 April 2005). Available at http://news.techworld.com/operating-systems/3477/moores-law-is-dead-says...
Thanks :)
NewtonPulsifer
My generic response would be
My generic response would be that Kurzweil only addresses the price-performance of computers, which simply isn't enough. The chips will still draw around x watts each. Well, smarter design might squeeze that down a bit - but not a lot. That's where you'd still get bitten. The total cost of ownership of a computer will drown you, even if the CPU cost becomes noise. It won't be the cost of the CPU that's the problem, it will be the cost of the energy to compute with it.

EDIT: [Kurzweil] 2019 - The computational capacity of a $4,000 computing device (in 1999 dollars) is approximately equal to the computational capability of the human brain (20 quadrillion calculations per second) [/Kurzweil]

GPUs are at 5 teraFLOPS-ish (5x10^12 - we need to hit 20x10^15 by 2019, and honestly a 5 teraFLOP GPU is not as useful as a 5 teraFLOP general-use computer. Still, 4,000x in 5 years?).
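Spelling that last parenthetical out as arithmetic (the 5 teraFLOPS and 20 petaFLOPS figures are the ones quoted above; the 18-24 month doubling period is the loose Moore's Law figure used earlier in the thread):

[code]
import math

current_flops = 5e12    # ~5 teraFLOPS, the GPU figure quoted above
target_flops = 20e15    # 20 quadrillion calc/s, Kurzweil's brain estimate
years_left = 5          # roughly 2014 to 2019

gap = target_flops / current_flops            # 4000x
doublings = math.log2(gap)                    # ~12 doublings
months_per_doubling = years_left * 12 / doublings

print(f"gap: {gap:,.0f}x, doublings needed: {doublings:.1f}")
print(f"required doubling period: ~{months_per_doubling:.0f} months "
      f"(vs. the ~18-24 months usually assumed for Moore's Law)")
[/code]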
"I fear all we have done is to awaken a sleeping giant and fill him with a terrible resolve."- Isoroku Yamamoto