Spreadsheets (MS Excel, Google Apps) represent dates as the number of whole days since Jan 1 1900 (with the caveat of a Feb 29 oddity in Excel's case). OK, so it's 365 days per year except on leap years. But that's too much arithmetic already.
Presumably, java.util.[Gregorian]Calendar knows all this stuff. The problem is, I don't know how to access its knowledge.
In a speculative world, one might:
myGcalEarlier.set(1900, Calendar.JANUARY, 1);
myGcalLater.setTime(new Date());
long days1 = myGcalEarlier.mysteryMethod();
long days2 = myGcalLater.mysteryMethod();
long days = days2 - days1;
Sadly, Calendar.get(Calendar.DAY_OF_YEAR) doesn't satisfy for 'mysteryMethod'; Calendar would need a DAYS_EVER field to do what I want.
Is there an API for getting an accurate difference expressed in calendar days?
Notes
I really do want calendar days, not days-of-86400-seconds. Time zones and daylight-savings matters aside (thanks @Dipmedeep), leap years need to be considered. 31536000 seconds is 365 days in those terms. 3 out of 4 years, that gets me from Jan 1 to Jan 1. But on the 4th year, it only gets me from Jan 1 to Dec 31, giving me a 1-day error for every 4 years!
I already have a solution for getting the number of calendar days. It's a trivial bit of code to migrate to Java, and it gets the desired answer (although I don't understand it, and therefore distrust it). This question is specifically asking (even more so after editing) whether I can avoid doing those calculations entirely and defer them to a 'trusted' library in the JDK. I have thus far concluded 'no'.
This is a pretty dumb and inefficient way of achieving your goal, but it could be used to validate other techniques:
import java.util.Calendar;

public static void main(String[] args) {
    Calendar now = Calendar.getInstance();
    //now.setTime(new Date()); // or set the date you want to calculate the days for
    Calendar tmp = Calendar.getInstance();
    tmp.set(1900, Calendar.JANUARY, 1); // normalize the temporary calendar; only its year is changed below
    int days = 0;
    // iterate from 1900, check how many days are in each year, and sum them
    for (int i = 1900; i < now.get(Calendar.YEAR); i++) {
        tmp.set(Calendar.YEAR, i);
        int daysForThatYear = tmp.getActualMaximum(Calendar.DAY_OF_YEAR); // 365, or 366 in leap years
        days += daysForThatYear;
        System.out.printf("year:%4d days in the year:%3d, total days:%6d%n", i, daysForThatYear, days);
    }
    // add the days elapsed so far in the current year
    int daysThisYear = now.get(Calendar.DAY_OF_YEAR);
    days += daysThisYear;
    System.out.printf("year:%4d days in the year:%3d, total days:%6d%n", now.get(Calendar.YEAR), daysThisYear, days);
}
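The getActualMaximum(Calendar.DAY_OF_YEAR) call is what makes this loop leap-year-aware: it returns 366 for leap years and 365 otherwise, so the running total counts calendar days rather than fixed 24-hour units. That makes it a useful cross-check against the millisecond-division answers below.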
Little knowledge about the specifics of Java's date API here, but if you can find a method that gives you Unix timestamps, you should be able to figure it out: a Unix timestamp is the number of seconds since the epoch (Jan 1st, 1970, 0:00:00 UTC), so all you need to do is find the Unix timestamps for both dates, subtract, divide by 86400 (the number of seconds in a day) and cut off the fractional part.
The same can be done for any other linear representation of the time points - all you need to know is how to convert to that linear representation, and how many units there are in a day.
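As a minimal sketch of this approach (assuming java.util.Date is the source of the instants; Date.getTime() returns milliseconds since the epoch, so dividing by 1000 gives the Unix timestamp):

import java.util.Date;

public static long daysBetween(Date earlier, Date later) {
    long secs1 = earlier.getTime() / 1000; // Unix timestamp of the earlier instant
    long secs2 = later.getTime() / 1000;   // Unix timestamp of the later instant
    return (secs2 - secs1) / 86400;        // whole 86400-second days; fraction truncated
}

Keep in mind this counts fixed 86400-second units rather than calendar days, so around DST transitions (one of the concerns in the question's notes) the result can differ by one from a calendar-day count.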
You may use myGcalEarlier.getTimeInMillis() and myGcalLater.getTimeInMillis(), and then convert to days by dividing by the number of milliseconds in a day: 24*60*60*1000.
Also, watch the arguments to set(int year, int month, int date): the month is 0-based, so January is 0 (i.e. Calendar.JANUARY).
GregorianCalendar myGcalEarlier = new GregorianCalendar();
GregorianCalendar myGcalLater = new GregorianCalendar();
myGcalEarlier.set(1900, Calendar.JANUARY, 1);

long lTime1 = myGcalEarlier.getTimeInMillis();
long lTime2 = myGcalLater.getTimeInMillis();
long days = (lTime2 - lTime1) / (24 * 60 * 60 * 1000);
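One caveat: set(year, month, date) leaves the time-of-day fields untouched, and the two calendars can sit on opposite sides of a DST transition, so the millisecond difference is not always an exact multiple of 24 hours and the truncating division can be off by one. A sketch of one way around that (assuming counting whole days between midnight-UTC boundaries matches your definition of calendar days):

import java.util.Calendar;
import java.util.GregorianCalendar;
import java.util.TimeZone;

TimeZone utc = TimeZone.getTimeZone("UTC");        // UTC never shifts for DST
GregorianCalendar earlier = new GregorianCalendar(utc);
GregorianCalendar later = new GregorianCalendar(utc);
earlier.clear();                                   // zero all fields, including time-of-day
earlier.set(1900, Calendar.JANUARY, 1);            // midnight UTC, Jan 1 1900
long days = (later.getTimeInMillis() - earlier.getTimeInMillis()) / (24L * 60 * 60 * 1000);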