I wrote the following small program to print out the Fibonacci sequence:
static void Main(string[] args)
{
    Console.Write("Please give a value for n: ");
    Int16 n = Int16.Parse(Console.ReadLine());

    Int16 firstNo = 0;
    Int16 secondNo = 1;
    Console.WriteLine(firstNo);
    Console.WriteLine(secondNo);

    for (Int16 i = 0; i < n; i++)
    {
        // Problem on this line
        Int16 answer = firstNo + secondNo;
        Console.WriteLine(answer);
        firstNo = secondNo;
        secondNo = answer;
    }
    Console.ReadLine();
}
The compilation message is:
Cannot implicitly convert type 'int' to 'short'. An explicit conversion exists (are you missing a cast?)
Since everything involved is an Int16 (short), why are there any implicit conversions going on? And more specifically, why does it fail here (and not when initially assigning an int literal to the variable)?
An explanation would be much appreciated.
There is no + operator defined for Int16 in C#. Both Int16 operands are implicitly promoted to Int32 before the addition, so the result of firstNo + secondNo is an Int32, and the compiler will not implicitly narrow it back down to Int16. That also answers the second part of your question: assignments like Int16 firstNo = 0; compile because 0 is a constant expression whose value the compiler can verify fits in an Int16, whereas firstNo + secondNo is only evaluated at run time, so no such guarantee is possible.
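A minimal sketch illustrating both points (the variable and class names here are just for illustration):

using System;

class PromotionDemo
{
    static void Main()
    {
        Int16 a = 1;
        Int16 b = 2;

        // The built-in + operator has no Int16 overload, so both
        // operands are widened to Int32 and the result is Int32.
        var sum = a + b;
        Console.WriteLine(sum.GetType()); // prints System.Int32

        Int16 ok = 100;      // fine: constant 100 fits in Int16
        //Int16 bad = 40000; // compile error: constant out of range
        //Int16 also = sum;  // compile error: sum is not a constant
        Console.WriteLine(ok);
    }
}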
Change this line:

    Int16 answer = firstNo + secondNo;

to:

    Int16 answer = (Int16)(firstNo + secondNo);
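For completeness, a sketch of the full corrected program. The checked(...) wrapper is an addition beyond the cast itself: the plain cast silently wraps on overflow, and Fibonacci numbers exceed Int16.MaxValue (32767) after about two dozen terms, so checked turns that into an OverflowException instead:

using System;

class Program
{
    static void Main()
    {
        Console.Write("Please give a value for n: ");
        Int16 n = Int16.Parse(Console.ReadLine());

        Int16 firstNo = 0;
        Int16 secondNo = 1;
        Console.WriteLine(firstNo);
        Console.WriteLine(secondNo);

        for (Int16 i = 0; i < n; i++)
        {
            // checked(...) throws OverflowException instead of
            // silently wrapping once the sum no longer fits in Int16.
            Int16 answer = checked((Int16)(firstNo + secondNo));
            Console.WriteLine(answer);
            firstNo = secondNo;
            secondNo = answer;
        }
        Console.ReadLine();
    }
}

In practice, using a plain int throughout would avoid both the cast and the narrow range; Int16 buys you nothing here.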