Welcome to the OStack Knowledge Sharing Community for programmers and developers: Open, Learning and Sharing

c - MD5 hash calculates differently on server

I am running some C code I have written which calls the MD5 hashing functionality from a hashing library that someone else wrote (md5.c & md5.h). The odd behavior I have been seeing:

  1. The hashing works perfectly when compiling and running on my OS X machine: I hash a string, and it comes out to the exact hash that I have verified it to be with multiple other sources.

  2. The same code, with no changes, uploaded and compiled on the Linux-based server computes a different (wrong) hash.

Does anyone have any insight into how exactly this is possible? It's been driving me crazy for the past week and I do not understand why this is even possible. I have also tested it on another machine, compiled and executed it, and it works perfectly. It's only when I upload it to the server that the hash is no longer correct.

The hashing functionality file can be found at: http://people.csail.mit.edu/rivest/Md5.c

SOLVED: Thanks everyone, it was the 64-bit architecture issue. It's mighty annoying that it slipped my mind to consider that when debugging.


1 Answer


Try replacing (Md5.c, line 41)

typedef unsigned long int UINT4;

by

typedef uint32_t UINT4;

(include stdint.h if needed)

On a 64-bit machine, long int is (usually) 64 bits long instead of 32.

EDIT:

I tried this on a 64-bit Opteron and it solves the problem.

