This is not an issue but more of a general question regarding the cache.
Can I share the cache between concurrent runs? Say I have my control-repo, a bunch of private modules and a bunch of forge modules. Can I run multiple instances of g10k at the same time while using the same cachedir?
I had a quick look at the code and there doesn't seem to be any kind of locking apart from in-process mutexes — is that right?
Thanks again for all your hard work!
Regards,
Mathieu
Yes, you can use the cachedir for multiple parallel g10k runs.
Reading from the cache isn't problematic at all, but updating it in parallel could lead to race conditions — e.g. two g10k runs trying to download or extract the same Puppetlabs Forge archive, or to git pull the same Git repository, at the same time.
What does your setup look like? Just one control repo that you specify in a g10k config YAML? How are you calling g10k — with -config, or with multiple -puppetfile calls at the same time?
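Since g10k itself only uses in-process mutexes and doesn't take a cross-process lock on the cachedir, one way to stay strictly safe is to serialize the runs yourself. A minimal sketch using flock(1) from util-linux — the cachedir path and the placeholder command are illustrative assumptions, not part of g10k:

```shell
# Wrap each g10k invocation in flock(1) so only one process updates the
# shared cachedir at a time. The lock-file path here is just an example.
CACHEDIR=/tmp/g10k-cache-demo
mkdir -p "$CACHEDIR"

# flock takes an exclusive lock on the given file, runs the command, and
# releases the lock when the command exits; a second concurrent run blocks
# until the first one finishes. In a real setup you would replace the echo
# with something like: flock "$CACHEDIR/.lock" g10k -config g10k.yaml
flock "$CACHEDIR/.lock" sh -c 'echo "run with exclusive cache lock"'
```

The trade-off is that this serializes whole runs rather than individual downloads, so concurrent deploys queue up instead of overlapping — but it guarantees two runs never extract the same Forge archive or pull the same Git repo simultaneously.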