So I'm trying to create a function that takes in an array (I guess it's more of a JSON object, or so we were told) and returns a value based on it, but I keep getting an error, so I'm pretty certain I'm doing this wrong.
I'm fairly new at JavaScript, so go easy on me. Also, I found this thread, which is similar to the question I'm asking, but I don't quite understand THAT question (and therefore its answers).
Here's a sample of the object we're given:
var returned_json = {
    "nike_runs": [
        {
            "start_time": "2011-03-11T19:14:44Z",
            "calories": 12.0,
            "distance_miles": "0.10",
            "total_seconds": 288.0,
            "average_pace": "50.47"
        },
        {
            "start_time": "2011-03-11T19:41:25Z",
            "calories": 7.0,
            "distance_miles": "0.06",
            "total_seconds": 559.0,
            "average_pace": "165.19"
        },
        {
            "start_time": "2011-03-11T20:27:45Z",
            "calories": 197.0,
            "distance_miles": "1.63",
            "total_seconds": 8434.0,
            "average_pace": "86.22"
        },
        ...
    ]
}
Here's my code:
function getExp (returned_json) {
    var exp;
    for (var i = 0; i <= returned_json.nike_runs.length; i++) {
        exp += returned_json.nike_runs[i].calories;
    }
    return exp;
}
It returns an error:
TypeError: returned_json.nike_runs[i] is undefined
I figured this has to do with the fact that I'm not defining the type of object I want to pass into the function, but my research tells me that doesn't matter.
Help? :(
Thanks.
Use i < returned_json.nike_runs.length, not i <= returned_json.nike_runs.length. Arrays are zero-indexed, so the last valid index is nike_runs.length - 1; when i reaches nike_runs.length, returned_json.nike_runs[i] is undefined, and that's exactly the TypeError you're seeing.
Edit: While you're at it, you'd better define a starting value for exp too. It starts out undefined, and undefined plus a number is NaN.
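Putting both fixes together, a corrected version of your function might look like this (same names as in your question; the only changes are the loop bound and the initial value of exp):

function getExp(returned_json) {
    // Start the sum at 0; otherwise exp is undefined and
    // undefined + number evaluates to NaN.
    var exp = 0;
    // Use < so the last index visited is nike_runs.length - 1,
    // the final element of the array.
    for (var i = 0; i < returned_json.nike_runs.length; i++) {
        exp += returned_json.nike_runs[i].calories;
    }
    return exp;
}

With just the three runs shown in your sample, getExp(returned_json) would return 12 + 7 + 197 = 216.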