I would like to create a sequence, where:
a0 = 1100000
a1 = 1100000 * 1.012 - 25000
a2 = (1100000 * 1.012 - 25000) * 1.012 - 25000
a3 = ((1100000 * 1.012 - 25000) * 1.012 - 25000) * 1.012 - 25000
...
The problem is that it only calculates/displays a1 (1088200), one hundred times. (I want to store all the values in an array called ms.)
Below is the code I've come up with:
double interest = 1.012;
int[] ms = new int[100];
for (int i = 0; i < ms.Length; i++)
{
    int a0 = 1100000;
    ms[i] = Convert.ToInt32(a0 * interest - 25000);
    a0 = ms[i];
    Console.WriteLine(ms[i]);
}
Console.ReadLine();
You're declaring a new a0 variable in each iteration of the loop, and always initializing it with the value 1100000. If you want to use the value from the previous iteration within the loop, you need to declare it outside the loop:
int a0 = 1100000;
for (int i = 0; i < ms.Length; i++)
{
    ms[i] = Convert.ToInt32(a0 * interest - 25000);
    a0 = ms[i];
    Console.WriteLine(ms[i]);
}
Note that you'll be losing precision on each iteration - you might be best to keep a0 as a double (or decimal) and only cast when storing:
decimal interest = 1.012m;
decimal current = 1100000;
for (int i = 0; i < ms.Length; i++)
{
    current = current * interest - 25000;
    ms[i] = (int) current;
    Console.WriteLine(ms[i]);
}
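As an aside (this isn't part of the original answer, just a way to sanity-check the loop): a recurrence of the form a(n) = r * a(n-1) - c has a standard closed form, so you can verify any term without iterating:

a(n) = r^n * a(0) - c * (r^n - 1) / (r - 1)

With r = 1.012, c = 25000 and a(0) = 1100000, taking n = 1 gives 1.012 * 1100000 - 25000 * (0.012 / 0.012) = 1113200 - 25000 = 1088200, which matches the first value the loop prints.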
See more on this question at Stack Overflow.