I have some questions about the math in the first couple pages, specifically the introduction of k. I’m not totally sure I follow exactly what’s going on.
So, my assumption is that we’re trying to model AI capacity as a function of investment, and I assume that we’re modeling this as the integral of an exponential function of base k such that
C(i) = \int k^i \, di = \frac{k^i}{\log(k)}
with k held constant. The integral is necessary, I believe, to ensure that the derivative of C is positive in both the k<1 and k>1 scenarios.
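To spell out that check (my own arithmetic, not something from the paper): differentiating C just recovers the integrand, which is positive for any positive k,

\frac{dC}{di} = k^i > 0 \quad \text{for all } k > 0

so C is increasing whether k<1 or k>1; only the shape of the growth changes.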
This, I believe, matches the example of the nuclear chain reaction. I note here that C, as I've defined it, is only a function of investment and tells us nothing about time or any other variable. I think it's also true that we've defined C as an exponential because we're assuming that the AI is reinvesting its returns. This seems to conflict with the linear relationship between investment and returns mentioned in the Chalmers quote:
“The key issue is the “proportionality thesis” saying that among systems of certain class, an increase of δ in intelligence will yield an increase of δ in the intelligence of systems that these systems can design.”
although perhaps those deltas are not intended to be quantitatively equal.
But even then, I’m a little uncertain that my relation is correct. It is not clear to me that the sequence of logarithms obtained in the k<1 case is a result of this function. Specifically, I thought the notion of reinvestment was the motivation for choosing an exponential/logarithmic function to start with, and so I’m not clear on why reinvestment suddenly changes the behavior to that of nested logarithms. Is the logarithmic nature of our return being double counted?
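For concreteness, here is my reading of the nested-logarithm claim (my own formalization, not the paper's): if one round of investment i returns a capability of log(i), and each round's output is fed back in as the next round's investment, then

C_1 = \log i, \qquad C_2 = \log C_1 = \log \log i, \qquad C_n = \underbrace{\log \cdots \log}_{n \text{ times}} i

which is where my double-counting worry comes from: the logarithm already encodes the diminishing returns, and reinvestment then nests it.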
I was also confused by the statement
"Over the last many decades, world economic growth has been roughly exponential—growth has neither collapsed below exponential nor exploded above, implying a metaphorical k roughly equal to 1."
But from my model, which I think is the correct one, this isn’t true. I feel like I understand the math from the nuclear chain reaction, but I have
\int 1^i \, di = i + \text{constant}
so that k=1 implies not exponential growth, but linear growth. Even worse, no value of k in my model is capable of making growth "explode above" the exponential. I agree with the assessment that k has been slightly above 1, which gave me some hope that I still had the correct model, but then I got really discouraged by the fact that k for money is on the order of 1.02 while k for the neutrons in the nuclear pile was 1.006. The implication from the k values alone is that my bank account is somehow more explosive than a large pile of uranium. Unfortunately this is not true, and so it seems like my model needs to account not only for C as a function of i, but for C as a function of time as well.
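To make the time dependence explicit, here is a minimal numeric sketch (my own, and the ~1 ms neutron generation time is an assumption for illustration, not a figure from the post): if capability multiplies by k every generation of length tau, then C(t) = k^(t/tau), and the e-folding time is tau / ln(k).

```python
import math

def efold_time(k, tau):
    """Seconds for C(t) = k**(t / tau) to grow by a factor of e."""
    return tau / math.log(k)

YEAR = 365.25 * 24 * 3600  # seconds per year

# Money: k = 1.02 per generation of one year (compounding interest).
print(efold_time(1.02, YEAR) / YEAR)  # ~50.5 years per e-folding

# Pile: k = 1.006 per neutron generation; the ~1 ms generation time
# is my assumption for illustration, not a number from the post.
print(efold_time(1.006, 1e-3))        # ~0.17 seconds per e-folding
```

The pile wins by many orders of magnitude once generation time enters, which is exactly the term my C(i) is missing.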
This issue really comes into play with the prompt-critical AI. One of the ways prompt-critical AI is deemed capable of growing exponentially smarter is by stealing access to more hardware. Having this as an option challenges either the definition of investment or seriously challenges the notion of constant k. Even in the limit that solving AI problems is exponentially hard (k<1), couldn't stolen hardware still yield an effective k>1 coupled to a short generation time?
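One way to write that worry down (again, my own formalization rather than anything in the paper): let both k and the generation time \tau vary as the AI acquires hardware, so that

\frac{dC}{dt} = \frac{\ln k(t)}{\tau(t)} \, C(t)

and even a k(t) only barely above 1 explodes if \tau(t) is driven small enough.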
I’m really terrible at LW formatting/writing in tiny comment boxes, so if I screwed this up to the point of being confusing let me know.