In ActionScript, the Unix timestamp in milliseconds is obtainable like this:
public static function getTimeStamp():uint
{
    var now:Date = new Date();
    return now.getTime();
}
The doc clearly states the following:
getTime():Number
Returns the number of milliseconds since midnight January 1, 1970, universal time, for a Date object.
When I trace it, it returns the following:
824655597
So, 824655597 / 1000 / 60 / 60 / 24 / 365 ≈ 0.026 years.
This is obviously not correct, as it should be around 39 years.
Question #1: What's wrong here?
Now, onto the PHP part: I'm trying to get the timestamp in milliseconds there as well. The microtime()
function returns either a string ("0.29207800 1246365903") or a float (1246365134.01), depending on whether you pass true as its argument. Because I thought timestamps were easy, I was going to work this out myself, but now that I've tried it and run into this float, combined with my problems in ActionScript, I really have no clue.
Question #2: how can I get the Unix timestamp in milliseconds in PHP?
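For what it's worth, here is the direction I was fumbling toward; just a rough sketch assuming PHP 5+ (where microtime(true) is available), and I kept the result as a float/formatted string because the millisecond value is larger than PHP_INT_MAX on 32-bit builds, so correct me if this is the wrong approach:

<?php
// Two shapes microtime() can return:
$asString = microtime();      // e.g. "0.29207800 1246365903"  ("usec sec")
$asFloat  = microtime(true);  // e.g. 1246365134.01            (seconds as a float)

// From the string form: add the two parts, then scale to milliseconds.
list($usec, $sec) = explode(' ', $asString);
$msFromString = round(((float) $sec + (float) $usec) * 1000);

// From the float form: just scale and round.
$msFromFloat = round($asFloat * 1000);

echo sprintf("%.0f\n", $msFromString); // e.g. 1246365903292
echo sprintf("%.0f\n", $msFromFloat);  // e.g. 1246365134010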
Timestamps should be so easy; I'm probably missing something... sorry about that. Thanks in advance.
EDIT1: Answered the first question by myself. See below.
EDIT2: Answered second question by myself as well. See below. Can't accept answer within 48 hours.