0 votes
160 views
in Technique by (71.8m points)

C# vs C - Big performance difference

I'm finding massive performance differences between similar code in C and C#.

The C code is:

#include <stdio.h>
#include <time.h>
#include <math.h>

int main(void)
{
    int i;
    double root;

    clock_t start = clock();
    for (i = 0; i <= 100000000; i++) {
        root = sqrt(i);
    }
    printf("Time elapsed: %f\n", ((double)clock() - start) / CLOCKS_PER_SEC);

    return 0;
}

And the C# (console app) is:

using System;
using System.Collections.Generic;
using System.Text;

namespace ConsoleApplication2
{
    class Program
    {
        static void Main(string[] args)
        {
            DateTime startTime = DateTime.Now;
            double root;
            for (int i = 0; i <= 100000000; i++)
            {
                root = Math.Sqrt(i);
            }
            TimeSpan runTime = DateTime.Now - startTime;
            Console.WriteLine("Time elapsed: " + Convert.ToString(runTime.TotalMilliseconds/1000));
        }
    }
}

With the above code, the C# completes in 0.328125 seconds (release version) and the C takes 11.14 seconds to run.

The C is being compiled to a Windows executable using MinGW.

I've always been under the assumption that C/C++ was faster than, or at least comparable to, C#/.NET. What exactly is causing the C code to run over 30 times slower?

EDIT: It does appear that the C# optimizer was removing the sqrt call because root wasn't being used. I changed the assignment to root += and printed out the total at the end. I've also compiled the C with cl.exe and the /O2 flag set for maximum speed.
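For reference, a minimal sketch of what that change looks like on the C# side (the C version gets the equivalent root += accumulation and a printf of the total):

using System;

namespace ConsoleApplication2
{
    class Program
    {
        static void Main(string[] args)
        {
            DateTime startTime = DateTime.Now;
            double root = 0;
            for (int i = 0; i <= 100000000; i++)
            {
                // Accumulate rather than overwrite so the result stays live
                // and the optimizer cannot remove the Math.Sqrt call as dead code.
                root += Math.Sqrt(i);
            }
            TimeSpan runTime = DateTime.Now - startTime;
            Console.WriteLine("Time elapsed: " + Convert.ToString(runTime.TotalMilliseconds / 1000));
            // Print the accumulated total so the work is observably used.
            Console.WriteLine("Total: " + root);
        }
    }
}

With the result actually consumed, both compilers have to execute the square root on every iteration, which is what makes the release/O2 timings comparable.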

The results are now: 3.75 seconds for the C and 2.61 seconds for the C#.

The C is still taking longer, but this is acceptable.

1 Answer

0 votes
by (71.8m points)

You must be comparing debug builds. I just compiled your C code, and got

Time elapsed: 0.000000

If you don't enable optimizations, any benchmarking you do is completely worthless. (And if you do enable optimizations, the loop gets optimized away. So your benchmarking code is flawed too. You need to force it to run the loop, usually by summing up the result or similar, and printing it out at the end)

It seems that what you're measuring is basically "which compiler inserts the most debugging overhead". It turns out the answer is C, but that doesn't tell us which program is fastest, because when you want speed, you enable optimizations.

By the way, you'll save yourself a lot of headaches in the long run if you abandon any notion of languages being "faster" than each other. C# no more has a speed than English does.

There are certain things in the C language that would be efficient even with a naive, non-optimizing compiler, and there are others that rely heavily on a compiler to optimize everything away. And of course, the same goes for C# or any other language.

The execution speed is determined by:

  • the platform you're running on (OS, hardware, other software running on the system)
  • the compiler
  • your source code

A good C# compiler will yield efficient code. A bad C compiler will generate slow code. What about a C compiler which generated C# code, which you could then run through a C# compiler? How fast would that run? Languages don't have a speed. Your code does.

