The following code seems very simple, but I have no idea why it generates a "heap out of memory" error when the size is larger. Node.js v18.12.1 is being used.
#!/usr/bin/node
import fs from "node:fs"
import path from "node:path"
import http from 'node:http'

const port = process.argv[2] || 8080
const baseDir = process.argv[3] || "/tmp/"

function log(msg){
    if(!msg) return
    let now = new Date()
    let ts = now.toLocaleTimeString()
    let tsdate = now.toLocaleDateString().replace(/\//ig,"-")
    fs.appendFile(path.join(baseDir, "fileUpload" + tsdate + ".log"), ts + " " + msg + "\n", (err)=>{
        if(err) console.log(err)
    })
}

http.createServer((req, res) => {
    if (req.url != '/generate' || req.method !== 'GET') {
        log(req.headers.host + " " + req.method + " " + req.url + " " + req.headers.size + " usage_fail")
        res.writeHead(200, { 'Content-Type': 'text/plain' });
        return res.end("wrong usage\n")
    }
    res.writeHead(200, { 'Content-Type': 'text/plain' });
    for(let i = 0; i < req.headers.size; ++i){
        res.write("-")
    }
    return res.end()
}).listen(port, (err) => {
    console.log('Server listening on http://localhost:' + port, err)
})

/*
curl -o /tmp/zfile.txt -H "size:100" http://localhost:8080/generate
*/
The exact error message is as follows.
<--- Last few GCs --->
[3886618:0x1c854110] 87940 ms: Mark-sweep (reduce) 2028.6 (2058.0) -> 2028.3 (2058.0) MB, 8189.7 / 0.0 ms (+ 45.6 ms in 5 steps since start of marking, biggest step 13.3 ms, walltime since start of marking 8260 ms) (average mu = 0.112, current mu = 0.
[3886618:0x1c854110] 88011 ms: Scavenge (reduce) 2035.4 (2064.3) -> 2034.4 (2064.3) MB, 5.9 / 0.0 ms (average mu = 0.112, current mu = 0.045) allocation failure;
<--- JS stacktrace --->
FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
1: 0xb5eb0c node::Abort() [/usr/bin/node]
2: 0xa81dc0 void node::FPrintF<>(_IO_FILE*, char const*) [/usr/bin/node]
3: 0xd1ee70 v8::Utils::ReportOOMFailure(v8::internal::Isolate*, char const*, bool) [/usr/bin/node]
4: 0xd1f040 v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [/usr/bin/node]
5: 0xefd37c [/usr/bin/node]
6: 0xefdfe4 v8::internal::Heap::RecomputeLimits(v8::internal::GarbageCollector) [/usr/bin/node]
7: 0xf0e534 [/usr/bin/node]
8: 0xf0f0f8 v8::internal::Heap::CollectGarbage(v8::internal::AllocationSpace, v8::internal::GarbageCollectionReason, v8::GCCallbackFlags) [/usr/bin/node]
9: 0xeeb490 v8::internal::HeapAllocator::AllocateRawWithLightRetrySlowPath(int, v8::internal::AllocationType, v8::internal::AllocationOrigin, v8::internal::AllocationAlignment) [/usr/bin/node]
10: 0xeec468 v8::internal::HeapAllocator::AllocateRawWithRetryOrFailSlowPath(int, v8::internal::AllocationType, v8::internal::AllocationOrigin, v8::internal::AllocationAlignment) [/usr/bin/node]
11: 0xecf0f8 v8::internal::Factory::NewFillerObject(int, v8::internal::AllocationAlignment, v8::internal::AllocationType, v8::internal::AllocationOrigin) [/usr/bin/node]
12: 0x127422c v8::internal::Runtime_AllocateInOldGeneration(int, unsigned long*, v8::internal::Isolate*) [/usr/bin/node]
13: 0x165bbcc [/usr/bin/node]
Aborted (core dumped)
The curl command can specify a size in a header. The size indicates how many bytes will be generated for download. The code works very well when the size is small, such as less than 1000. However, when the size grows larger, such as 99999999, the server side crashes and produces the error messages above.
I know that manually increasing the Node.js heap limit is a possible workaround, but that does not seem like a proper solution. The core of the code is just a simple for loop and the write() function. Why does this code cause an allocation problem?
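My current understanding (which may be wrong) is that res.write() returns false once the stream's internal buffer is full, and since the for loop never checks that return value or waits for the 'drain' event, the one-byte chunks pile up in the heap faster than the socket can send them. Here is a minimal stand-alone demonstration of that mechanism with a deliberately slow Writable (the highWaterMark, delay, and iteration count are arbitrary values I picked for the demo):

```javascript
import { Writable } from "node:stream";

// A slow sink with a tiny buffer, so write() starts returning false quickly.
const sink = new Writable({
    highWaterMark: 16,                            // bytes before write() reports "full"
    write(chunk, enc, cb) { setTimeout(cb, 1); }  // consume one chunk per ~1 ms
});

let rejected = 0;
for (let i = 0; i < 300; i++) {
    // Just like the server loop: ignore the return value and keep writing.
    if (!sink.write("-")) rejected++;  // false => the chunk is only queued in memory
}
sink.end();
console.log(rejected > 0);  // prints true: most writes just grew the in-memory queue
```

If this is right, then with one byte per write() call each queued chunk also carries object overhead far beyond its single payload byte, which would explain why roughly 100 MB of dashes can exhaust the roughly 2 GB default heap visible in the GC log.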
I tried the above code on a Jetson Xavier NX, which has an ARM-based CPU and 8 GB of memory. Node.js is v18.12.1. I expect this code to be able to generate content of arbitrary size, at least 1 GB, for download.
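For what it's worth, a backpressure-aware variant along the following lines seems to avoid the problem. This is only a sketch; the sendDashes helper name and the 64 KB chunk size are my own choices, not part of any API:

```javascript
// Write `size` dashes to any writable stream (e.g. the http `res`),
// pausing whenever write() returns false and resuming on 'drain'.
function sendDashes(dest, size) {
    const CHUNK = 64 * 1024;  // write in large chunks, not one byte at a time
    const full = "-".repeat(CHUNK);
    let remaining = size;

    function writeMore() {
        while (remaining > 0) {
            const piece = remaining >= CHUNK ? full : "-".repeat(remaining);
            remaining -= piece.length;
            if (!dest.write(piece)) {         // internal buffer is full:
                dest.once("drain", writeMore); // stop, resume after it drains
                return;
            }
        }
        dest.end();
    }
    writeMore();
}

// Inside the request handler, the for loop would become:
//   sendDashes(res, Number(req.headers.size) || 0)
```

With this, memory use should stay bounded by the stream's highWaterMark regardless of the requested size, but I would like to confirm that this backpressure explanation is actually what is happening.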