
OSX tests are stuck #3887

Open
RafaelGSS opened this issue Sep 2, 2024 · 27 comments
Comments

@RafaelGSS
Member

Hey folks,

I'm trying to finish the CI for nodejs/node#54560 (target is today) and the osx test job is taking too long (it appears stuck, >12h).

@RafaelGSS
Member Author

Alpine might be facing a similar problem. It's stuck on CITGM: https://ci.nodejs.org/job/citgm-smoker/3472/

@richardlau
Member

> Hey folks,
>
> I'm trying to finish the CI for nodejs/node#54560 (target is today) and the osx test job is taking too long (it appears stuck, >12h).

It's not stuck -- https://ci.nodejs.org/job/node-test-commit-osx/60982/ hasn't started yet (it's been waiting for an osx11-x64 machine for 12 hours).

https://ci.nodejs.org/job/node-test-commit-osx/nodes=osx11-x64/ is incredibly backlogged -- looks like we only have one machine serving jobs.

@richardlau
Member

Running the cleanup script on test-orka-macos11-x64-2.

@richardlau
Member

test-orka-macos11-x64-2 is back online and processing jobs. Backlog is currently 16 queued jobs (+1 citgm job) for osx11-x64.

@aduh95
Contributor

aduh95 commented Sep 2, 2024

Getting java.io.IOException: No space left on device on test-orka-macos11-x64-1

e.g.: https://ci.nodejs.org/job/node-test-commit-osx/61013/nodes=osx11-x64/console

@richardlau
Member

Ran the cleanup script on test-orka-macos11-x64-1.

@targos
Member

targos commented Sep 5, 2024

Disk space is too low again on test-orka-macos11-x64-2. What's happening?

@targos
Member

targos commented Sep 5, 2024

Now both nodes are offline. We can't keep cleaning them manually every day.

@richardlau
Member

Have the macOS Node.js builds got considerably larger?

18 GB (#3878 (comment)) sounds really high -- builds on Linux are only ~3 GB.

@targos
Member

targos commented Sep 5, 2024

According to my local folders:

 18G	canary/out/Release
 18G	node/out/Release
 17G	v20.x/out/Release
 18G	v22.x/out/Release

I don't have a v18.x build.

@richardlau
Member

I've run the cleanup script on test-orka-macos11-x64-2 and rebooted the machine. This is now reporting ~21GB of free space -- a node-test-commit-osx run has started and it is currently consuming 1.8 GB of that (and would be expected to grow to 18 GB).

test-orka-macos11-x64-2:~ iojs$ du -hs build/workspace/node-test-commit-osx/
1.8G    build/workspace/node-test-commit-osx/
test-orka-macos11-x64-2:~ iojs$ df -h
Filesystem       Size   Used  Avail Capacity iused     ifree %iused  Mounted on
/dev/disk2s5s1   90Gi   14Gi   21Gi    41%  553788 941116212    0%   /
devfs           188Ki  188Ki    0Bi   100%     650         0  100%   /dev
/dev/disk2s4     90Gi  1.0Mi   21Gi     1%       1 941669999    0%   /System/Volumes/VM
/dev/disk2s2     90Gi  305Mi   21Gi     2%    1038 941668962    0%   /System/Volumes/Preboot
/dev/disk2s6     90Gi  592Ki   21Gi     1%      17 941669983    0%   /System/Volumes/Update
/dev/disk2s1     90Gi   54Gi   21Gi    73%  529196 941140804    0%   /System/Volumes/Data
map auto_home     0Bi    0Bi    0Bi   100%       0         0  100%   /System/Volumes/Data/home
test-orka-macos11-x64-2:~ iojs$

@richardlau
Member

I suppose one other question -- where is the tmp dir on the macOS machines? /tmp/ looks surprisingly empty when we know that the tests are leaving behind node-coverage-* directories (#3864 -- I assume behaviour on macOS would be the same and these directories are being written somewhere).

@targos
Member

targos commented Sep 5, 2024

echo $TMPDIR should print the location. I think the actual value is random and different on each macOS installation.
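As a cross-check (a sketch, not necessarily how the CI resolves it): on macOS the value behind `$TMPDIR` can be queried directly with `getconf DARWIN_USER_TEMP_DIR`, which helps in non-login shells where the variable may be unset:

```shell
#!/bin/sh
# Resolve the per-user temp dir even when $TMPDIR is unset (e.g. in a
# non-login shell). DARWIN_USER_TEMP_DIR is the macOS configuration
# variable behind $TMPDIR; /tmp is a fallback for other systems.
tmpdir=${TMPDIR:-$(getconf DARWIN_USER_TEMP_DIR 2>/dev/null || true)}
tmpdir=${tmpdir:-/tmp}
echo "$tmpdir"
```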

@richardlau
Member

$ ssh test-orka-macos11-x64-1
Last login: Thu Sep  5 09:00:41 2024 from 172.16.44.16
administrator@test-orka-macos11-x64-1 ~ % sudo -s su - iojs

The default interactive shell is now zsh.
To update your account to use zsh, please run `chsh -s /bin/zsh`.
For more details, please visit https://support.apple.com/kb/HT208050.
test-orka-macos11-x64-1:~ iojs$ echo $TMPDIR

test-orka-macos11-x64-1:~ iojs$ echo $tempdir

test-orka-macos11-x64-1:~ iojs$ echo $TMP

test-orka-macos11-x64-1:~ iojs$ echo $TEMP

test-orka-macos11-x64-1:~ iojs$

🤷

@richardlau
Member

Ah, it's in https://ci.nodejs.org/job/node-test-commit-osx/nodes=osx11-x64/61152/injectedEnvVars/:

TMPDIR /var/folders/7l/fb92_ds12k19tryhky7zbfnw0000gp/T/
test-orka-macos11-x64-1:~ iojs$ du -hs /var/folders/7l/fb92_ds12k19tryhky7zbfnw0000gp/T/
377M    /var/folders/7l/fb92_ds12k19tryhky7zbfnw0000gp/T/
test-orka-macos11-x64-1:~ iojs$

@richardlau
Member

> I've run the cleanup script on test-orka-macos11-x64-2 and rebooted the machine. This is now reporting ~21GB of free space [...]

So this is now:

test-orka-macos11-x64-2:~ iojs$ df -h
Filesystem       Size   Used  Avail Capacity iused     ifree %iused  Mounted on
/dev/disk2s5s1   90Gi   14Gi  715Mi    96%  553788 941116212    0%   /
devfs           188Ki  188Ki    0Bi   100%     650         0  100%   /dev
/dev/disk2s4     90Gi  1.0Gi  715Mi    59%       2 941669998    0%   /System/Volumes/VM
/dev/disk2s2     90Gi  305Mi  715Mi    30%    1038 941668962    0%   /System/Volumes/Preboot
/dev/disk2s6     90Gi  592Ki  715Mi     1%      17 941669983    0%   /System/Volumes/Update
/dev/disk2s1     90Gi   73Gi  715Mi   100%  552325 941117675    0%   /System/Volumes/Data
map auto_home     0Bi    0Bi    0Bi   100%       0         0  100%   /System/Volumes/Data/home
/dev/disk2s5     90Gi   14Gi  715Mi    96%  553788 941116212    0%   /private/tmp/msu-target-847dorJ0
test-orka-macos11-x64-2:~ iojs$

@richardlau
Member

> I've run the cleanup script on test-orka-macos11-x64-2 and rebooted the machine. [...] So this is now: [df output showing only 715Mi available]

And now that the build completed (and a new one started):

test-orka-macos11-x64-2:~ iojs$ df -h
Filesystem       Size   Used  Avail Capacity iused     ifree %iused  Mounted on
/dev/disk2s5s1   90Gi   14Gi   19Gi    43%  553788 941116212    0%   /
devfs           189Ki  189Ki    0Bi   100%     654         0  100%   /dev
/dev/disk2s4     90Gi  1.0Gi   19Gi     5%       2 941669998    0%   /System/Volumes/VM
/dev/disk2s2     90Gi  305Mi   19Gi     2%    1038 941668962    0%   /System/Volumes/Preboot
/dev/disk2s6     90Gi  592Ki   19Gi     1%      17 941669983    0%   /System/Volumes/Update
/dev/disk2s1     90Gi   54Gi   19Gi    74%  530193 941139807    0%   /System/Volumes/Data
map auto_home     0Bi    0Bi    0Bi   100%       0         0  100%   /System/Volumes/Data/home
/dev/disk2s5     90Gi   14Gi   19Gi    43%  553788 941116212    0%   /private/tmp/msu-target-847dorJ0
test-orka-macos11-x64-2:~ iojs$

12 MB left behind in the tmp dir:

test-orka-macos11-x64-2:~ iojs$ du -hs /var/folders/7l/fb92_ds12k19tryhky7zbfnw0000gp/T/
 12M    /var/folders/7l/fb92_ds12k19tryhky7zbfnw0000gp/T/
test-orka-macos11-x64-2:~ iojs$ ls -al /var/folders/7l/fb92_ds12k19tryhky7zbfnw0000gp/T/
total 16
drwx------@ 37 iojs  iojs  1184 Sep  5 09:37 .
drwxr-xr-x@  6 iojs  iojs   192 Sep 30  2021 ..
drwx------   2 iojs  iojs    64 Sep  5 08:52 com.apple.trustd
drwxr-xr-x   3 iojs  iojs    96 Sep  5 08:44 hsperfdata_iojs
-rw-r--r--   1 iojs  iojs   397 Sep  5 09:37 jenkins4195096154793581525.sh
drwx------   4 iojs  iojs   128 Sep  5 09:12 node-coverage-2fmKoj
drwx------   4 iojs  iojs   128 Sep  5 09:12 node-coverage-3mqX4H
drwx------   4 iojs  iojs   128 Sep  5 09:12 node-coverage-6LgS3y
drwx------   4 iojs  iojs   128 Sep  5 09:12 node-coverage-8RTx0H
drwx------   4 iojs  iojs   128 Sep  5 09:12 node-coverage-8nxLOK
drwx------   3 iojs  iojs    96 Sep  5 09:12 node-coverage-Ai2iwh
drwx------   4 iojs  iojs   128 Sep  5 09:12 node-coverage-EXbzD6
drwx------   4 iojs  iojs   128 Sep  5 09:12 node-coverage-Fk5G7r
drwx------   4 iojs  iojs   128 Sep  5 09:12 node-coverage-HtpbZh
drwx------   4 iojs  iojs   128 Sep  5 09:12 node-coverage-ImTLVV
drwx------   5 iojs  iojs   160 Sep  5 09:12 node-coverage-J8cvBr
drwx------   3 iojs  iojs    96 Sep  5 09:12 node-coverage-Mcsq9Y
drwx------   4 iojs  iojs   128 Sep  5 09:12 node-coverage-Ortes7
drwx------   4 iojs  iojs   128 Sep  5 09:12 node-coverage-SCRlbv
drwx------   4 iojs  iojs   128 Sep  5 09:12 node-coverage-TyJJeu
drwx------   4 iojs  iojs   128 Sep  5 09:12 node-coverage-UXhwg1
drwx------   4 iojs  iojs   128 Sep  5 09:12 node-coverage-W04ql9
drwx------   4 iojs  iojs   128 Sep  5 09:12 node-coverage-ZegWw1
drwx------   4 iojs  iojs   128 Sep  5 09:12 node-coverage-b5JMMn
drwx------   4 iojs  iojs   128 Sep  5 09:12 node-coverage-f9Pu43
drwx------   4 iojs  iojs   128 Sep  5 09:12 node-coverage-jM9RNR
drwx------   4 iojs  iojs   128 Sep  5 09:12 node-coverage-lVtNgC
drwx------   3 iojs  iojs    96 Sep  5 09:12 node-coverage-nBlgXh
drwx------   4 iojs  iojs   128 Sep  5 09:12 node-coverage-nOIC1s
drwx------   4 iojs  iojs   128 Sep  5 09:12 node-coverage-q4m9IL
drwx------   4 iojs  iojs   128 Sep  5 09:12 node-coverage-r17v5y
drwx------   4 iojs  iojs   128 Sep  5 09:12 node-coverage-r8REsy
drwx------   5 iojs  iojs   160 Sep  5 09:12 node-coverage-tQ8VE4
drwx------   3 iojs  iojs    96 Sep  5 09:12 node-coverage-tWoSur
drwx------   4 iojs  iojs   128 Sep  5 09:12 node-coverage-wG0pDU
drwx------   4 iojs  iojs   128 Sep  5 09:12 node-coverage-yXaSm0
-rw-------   1 iojs  iojs  1285 Sep  5 09:13 xcrun_db
test-orka-macos11-x64-2:~ iojs$
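Given how these node-coverage-* directories accumulate, one option would be to sweep out stale ones periodically rather than only at job start. A hypothetical sweep, not something currently in the job (the one-day age cutoff is an arbitrary choice):

```shell
#!/bin/sh
# Remove node-coverage-* directories in the temp dir that are older
# than one day, leaving any that a running job may still be using.
tmp=${TMPDIR:-/tmp}   # fall back to /tmp so the sketch is self-contained
find "$tmp" -maxdepth 1 -type d -name 'node-coverage-*' -mtime +0 \
  -exec rm -rf {} +
```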

@richardlau
Member

I ran the cleanup script again on test-orka-macos11-x64-1.

FWIW, the temp dir was 389MB:

administrator@test-orka-macos11-x64-1 ~ % sudo du -hs /var/folders/7l/fb92_ds12k19tryhky7zbfnw0000gp/T
389M    /var/folders/7l/fb92_ds12k19tryhky7zbfnw0000gp/T
administrator@test-orka-macos11-x64-1 ~ %

I've added a small bit of cleanup to node-test-commit-osx:

if [ -n "${TMPDIR+x}" ] && [ -d "${TMPDIR}" ]; then
  # Quote the variable, but leave the glob outside the quotes so it expands.
  rm -rf "${TMPDIR}"/node-coverage*
fi

@aduh95
Contributor

aduh95 commented Sep 7, 2024

It looks like both runners are offline (again). Should we disable OSX on main until we can figure something out?

@jasnell
Member

jasnell commented Sep 8, 2024

> Should we disable OSX on main until we can figure something out?

Yes please. This is getting to be a real problem.

cjihrig added a commit to cjihrig/node that referenced this issue Sep 9, 2024
The test runner's code coverage leaves old coverage data in the
temp directory. This commit updates the cleanup logic to:

- Stop code collection. Otherwise V8 would write collection data
  again when the process exits.
- Remove the temp directory containing the coverage data.
- Attempt to clean up the coverage data even if parsing the
  data resulted in an error.

With this change, I no longer see any coverage data left behind
in the system temp directory.

Refs: nodejs/build#3864
Refs: nodejs/build#3887
cjihrig added a commit to cjihrig/node that referenced this issue Sep 9, 2024
@mohd-akram

> According to my local folders:
>
>  18G	canary/out/Release
>  18G	node/out/Release
>  17G	v20.x/out/Release
>  18G	v22.x/out/Release
>
> I don't have a v18.x build.

What in those folders is taking so much space? The Node.js build in MacPorts is failing for the same reason.

@targos
Member

targos commented Sep 9, 2024

$ du -sh out/Release
 18G	out/Release
$ find out/Release -type f -exec du -h {} \; | sort -rh | head -n 50
3.8G	out/Release/libv8_base_without_compiler.a
2.6G	out/Release/libv8_initializers.a
938M	out/Release/libv8_compiler.a
489M	out/Release/libv8_turboshaft.a
476M	out/Release/libnode.a
104M	out/Release/cctest
102M	out/Release/node_mksnapshot
102M	out/Release/node
102M	out/Release/embedtest
 89M	out/Release/mksnapshot
 89M	out/Release/libicutools.a
 72M	out/Release/gen/icudt75_dat.S
 70M	out/Release/libtorque_base.a
 62M	out/Release/libicui18n.a
 41M	out/Release/obj/deps/v8/src/compiler/turboshaft/v8_turboshaft.csa-optimize-phase.o
 31M	out/Release/gen-regexp-special-case
 29M	out/Release/obj/tools/icu/gen/icudata.icudt75_dat.o
 29M	out/Release/libv8_initializers_slow.a
 29M	out/Release/libicudata.a
 29M	out/Release/gen/icudt75l.dat
 29M	out/Release/gen/icudt75.dat
 26M	out/Release/obj/deps/v8/src/compiler/turboshaft/v8_turboshaft.maglev-graph-building-phase.o
 25M	out/Release/libicuucx.a
 23M	out/Release/obj/deps/v8/src/compiler/v8_compiler.pipeline.o
 22M	out/Release/obj/deps/v8/src/wasm/v8_base_without_compiler.turboshaft-graph-interface.o
 22M	out/Release/libopenssl.a
 21M	out/Release/obj/deps/v8/src/objects/v8_base_without_compiler.elements.o
 21M	out/Release/obj/deps/v8/src/compiler/turboshaft/v8_turboshaft.store-store-elimination-phase.o
 21M	out/Release/obj/deps/v8/src/compiler/turboshaft/v8_turboshaft.machine-lowering-phase.o
 21M	out/Release/obj/deps/v8/src/compiler/turboshaft/v8_compiler.wasm-optimize-phase.o
 20M	out/Release/obj/deps/v8/src/maglev/v8_base_without_compiler.maglev-graph-builder.o
 20M	out/Release/obj/deps/v8/src/compiler/turboshaft/v8_turboshaft.type-assertions-phase.o
 19M	out/Release/obj/deps/v8/src/compiler/turboshaft/v8_turboshaft.typed-optimizations-phase.o
 19M	out/Release/obj/deps/v8/src/compiler/turboshaft/v8_turboshaft.pipelines.o
 19M	out/Release/obj/deps/v8/src/compiler/turboshaft/v8_turboshaft.optimize-phase.o
 19M	out/Release/obj/deps/v8/src/compiler/turboshaft/v8_turboshaft.loop-unrolling-phase.o
 19M	out/Release/obj/deps/v8/src/codegen/v8_initializers.code-stub-assembler.o
 18M	out/Release/obj/tools/v8_gypfiles/gen/torque-generated/test/torque/v8_initializers.test-torque-tq-csa.o
 18M	out/Release/obj/deps/v8/src/maglev/v8_base_without_compiler.maglev-compiler.o
 18M	out/Release/obj/deps/v8/src/compiler/turboshaft/v8_turboshaft.loop-peeling-phase.o
 18M	out/Release/obj/deps/v8/src/compiler/turboshaft/v8_turboshaft.code-elimination-and-simplification-phase.o
 18M	out/Release/obj/deps/v8/src/compiler/turboshaft/v8_compiler.wasm-lowering-phase.o
 18M	out/Release/obj/deps/v8/src/compiler/turboshaft/v8_base_without_compiler.wasm-dead-code-elimination-phase.o
 17M	out/Release/obj/tools/v8_gypfiles/gen/torque-generated/src/builtins/v8_initializers_slow.js-to-wasm-tq-csa.o
 17M	out/Release/obj/tools/v8_gypfiles/gen/torque-generated/src/builtins/v8_initializers.cast-tq-csa.o
 17M	out/Release/obj/deps/v8/src/compiler/turboshaft/v8_turboshaft.graph-builder.o
 17M	out/Release/obj/deps/v8/src/compiler/turboshaft/v8_turboshaft.block-instrumentation-phase.o
 17M	out/Release/obj/deps/v8/src/compiler/turboshaft/v8_compiler.wasm-gc-optimize-phase.o
 16M	out/Release/obj/tools/v8_gypfiles/gen/torque-generated/third_party/v8/builtins/v8_initializers.array-sort-tq-csa.o
 16M	out/Release/obj/deps/v8/src/maglev/v8_base_without_compiler.maglev-ir.o
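The listing suggests most of the 18 GB is static archives rather than the final binaries (the four large libv8_*.a files alone are ~7.8 GB). One quick way to confirm, using the same paths as above:

```shell
# Sum just the static libraries in the build output; the last line
# printed by `du -ch` is the combined total.
du -ch out/Release/*.a | tail -n 1
```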

@targos
Member

targos commented Sep 9, 2024

As a comparison, here are the numbers for a V8 build following the official documentation on the same machine:

$ du -sh out/arm64.release
5.4G	out/arm64.release
$ find out/arm64.release -type f -exec du -h {} \; | sort -rh | head -n 50
 60M	out/arm64.release/mksnapshot
 47M	out/arm64.release/d8
 38M	out/arm64.release/obj/v8_turboshaft/csa-optimize-phase.o
 24M	out/arm64.release/obj/v8_turboshaft/maglev-graph-building-phase.o
 22M	out/arm64.release/obj/v8_base_without_compiler/elements.o
 20M	out/arm64.release/obj/v8_compiler/pipeline.o
 19M	out/arm64.release/obj/v8_turboshaft/store-store-elimination-phase.o
 19M	out/arm64.release/obj/v8_turboshaft/machine-lowering-phase.o
 19M	out/arm64.release/obj/v8_base_without_compiler/maglev-graph-builder.o
 18M	out/arm64.release/obj/v8_compiler/wasm-optimize-phase.o
 18M	out/arm64.release/obj/v8_base_without_compiler/turboshaft-graph-interface.o
 17M	out/arm64.release/obj/v8_turboshaft/type-assertions-phase.o
 17M	out/arm64.release/obj/v8_turboshaft/optimize-phase.o
 16M	out/arm64.release/obj/v8_turboshaft/typed-optimizations-phase.o
 16M	out/arm64.release/obj/v8_turboshaft/pipelines.o
 16M	out/arm64.release/obj/v8_turboshaft/loop-unrolling-phase.o
 16M	out/arm64.release/obj/v8_turboshaft/loop-peeling-phase.o
 16M	out/arm64.release/obj/v8_turboshaft/code-elimination-and-simplification-phase.o
 16M	out/arm64.release/obj/v8_base_without_compiler/wasm-dead-code-elimination-phase.o
 16M	out/arm64.release/obj/v8_base_without_compiler/maglev-compiler.o
 16M	out/arm64.release/obj/v8_base_without_compiler/api.o
 15M	out/arm64.release/obj/v8_turboshaft/graph-builder.o
 15M	out/arm64.release/obj/v8_turboshaft/block-instrumentation-phase.o
 15M	out/arm64.release/obj/v8_compiler/wasm-lowering-phase.o
 15M	out/arm64.release/obj/v8_compiler/wasm-gc-optimize-phase.o
 14M	out/arm64.release/obj/v8_base_without_compiler/maglev-ir.o
 13M	out/arm64.release/obj/v8_base_without_compiler/mark-compact.o
 12M	out/arm64.release/obj/v8_turboshaft/simplified-lowering-phase.o
 12M	out/arm64.release/obj/v8_initializers/code-stub-assembler.o
 12M	out/arm64.release/obj/v8_base_without_compiler/objects.o
 12M	out/arm64.release/obj/v8_base_without_compiler/maglev-code-generator.o
 12M	out/arm64.release/obj/v8_base_without_compiler/liftoff-compiler.o
 12M	out/arm64.release/obj/torque_base/implementation-visitor.o
 11M	out/arm64.release/obj/v8_compiler/instruction-selector-arm64.o
 11M	out/arm64.release/obj/v8_base_without_compiler/maglev-graph-printer.o
 11M	out/arm64.release/obj/v8_base_without_compiler/heap.o
 10M	out/arm64.release/obj/v8_turboshaft/late-load-elimination-reducer.o
 10M	out/arm64.release/obj/v8_compiler/instruction-selector.o
 10M	out/arm64.release/obj/v8_base_without_compiler/isolate.o
 10M	out/arm64.release/obj/v8_base_without_compiler/builtins-temporal.o
 10M	out/arm64.release/obj/torque_base/torque-parser.o
 10M	out/arm64.release/icudtl.dat
9.9M	out/arm64.release/obj/v8_compiler/js-call-reducer.o
9.9M	out/arm64.release/obj/d8/d8.o
9.8M	out/arm64.release/obj/v8_turboshaft/recreate-schedule.o
9.7M	out/arm64.release/obj/v8_base_without_compiler/wrappers.o
9.7M	out/arm64.release/obj/v8_base_without_compiler/factory.o
9.5M	out/arm64.release/obj/v8_compiler/wasm-compiler.o
9.5M	out/arm64.release/obj/v8_base_without_compiler/objects-printer.o
9.5M	out/arm64.release/obj/v8_base_without_compiler/concurrent-marking.o

@targos
Member

targos commented Sep 9, 2024

I disabled node-test-commit-osx from https://ci.nodejs.org/job/node-test-commit/configure:

[screenshot of the Jenkins job configuration]

nodejs-github-bot pushed a commit to nodejs/node that referenced this issue Sep 11, 2024
The test runner's code coverage leaves old coverage data in the
temp directory. This commit updates the cleanup logic to:

- Stop code collection. Otherwise V8 would write collection data
  again when the process exits.
- Remove the temp directory containing the coverage data.
- Attempt to clean up the coverage data even if parsing the
  data resulted in an error.

With this change, I no longer see any coverage data left behind
in the system temp directory.

Refs: nodejs/build#3864
Refs: nodejs/build#3887
PR-URL: #54856
Reviewed-By: Yagiz Nizipli <[email protected]>
Reviewed-By: Jake Yuesong Li <[email protected]>
Reviewed-By: Luigi Pinca <[email protected]>
Reviewed-By: Moshe Atlow <[email protected]>
Reviewed-By: James M Snell <[email protected]>
aduh95 pushed a commit to nodejs/node that referenced this issue Sep 12, 2024
aduh95 pushed a commit to nodejs/node that referenced this issue Sep 13, 2024
aduh95 pushed a commit to nodejs/node that referenced this issue Sep 13, 2024
targos pushed a commit to nodejs/node that referenced this issue Sep 22, 2024
targos pushed a commit to nodejs/node that referenced this issue Sep 26, 2024
@ryanaslett
Contributor

This was a symptom of having long-lived OSX runners, which will soon be fixed by our transition to ephemeral Orka macOS runners.

In the meantime, the disk was filling up because macOS Spotlight was indexing the builds, creating new, unique UUID-to-filename mappings that filled up /private/var/db/uuidtext.

I've disabled Spotlight and removed the Spotlight databases.

We've now got 31GB free with a workspace on test-orka-macos11-x64-1

And 52 GB free on test-orka-macos11-x64-2


Both of the orka-macos10.15-x64 machines have over 50GB available as well.

We should be able to re-enable these now, and they should last until they are replaced shortly.
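For anyone hitting the same problem elsewhere: disabling Spotlight indexing machine-wide is normally done with `mdutil`. A sketch of the usual commands (the exact steps run on these machines aren't shown in this thread):

```shell
# Disable Spotlight indexing on all volumes, then erase the existing
# indexes so the space they occupy is reclaimed.
sudo mdutil -a -i off
sudo mdutil -a -E
```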

targos pushed a commit to nodejs/node that referenced this issue Oct 2, 2024
targos pushed a commit to nodejs/node that referenced this issue Oct 2, 2024
@ryanaslett
Contributor

All of the legacy OSX machines have been running successfully now for about 36 hours. The build results flap between green and yellow status, and almost every yellow status comes down to one particular test that keeps being flagged as flaky:

[screenshot]

I'm not sure if that test is specifically flaky on OSX or if it's a problematic test in general, but it accounts for the vast majority of flaky results for the current OSX builds.

louwers pushed a commit to louwers/node that referenced this issue Nov 2, 2024
tpoisseau pushed a commit to tpoisseau/node that referenced this issue Nov 21, 2024
@RafaelGSS
Member Author

Can we close it?
