I'm sure there's a reason I have to add three zeros to every Unix timestamp in JavaScript in order to get the correct date. Can you tell me why? Is it as simple as milliseconds since the epoch vs. seconds?
Because Javascript uses milliseconds internally, while normal UNIX timestamps are usually in seconds.
Javascript uses the number of milliseconds since epoch. Unix timestamp is seconds since epoch.
Hence the need to convert a Unix timestamp into milliseconds before using it in Javascript.
Unix time is the number of seconds since the epoch (1 Jan 1970). In Javascript, the Date
object expects the number of milliseconds since the epoch, hence the 1000-fold difference.
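To illustrate the point made above, here is a minimal sketch of the conversion (the timestamp value is just an example):

```javascript
// A Unix timestamp counts seconds since the epoch (1 Jan 1970 UTC).
const unixSeconds = 1000000000;

// The Date constructor expects milliseconds, so multiply by 1000
// (the "three zeros" from the question).
const date = new Date(unixSeconds * 1000);

console.log(date.toISOString()); // "2001-09-09T01:46:40.000Z"
```

Going the other way, `Math.floor(Date.now() / 1000)` produces a conventional seconds-based Unix timestamp from JavaScript's millisecond clock.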