node.js - Deployment race condition causing CDN to cache old or broken files
Our current deploy process goes like this:
- Use grunt to create the production assets, create a datestamp, and point the files at our CDN (e.g. /scripts/20140324142354/app.min.js). Side note: I've heard this process called "versioning" before, but I'm not sure if that's the proper term. (A rough sketch of the stamping step follows the list below.) Commit the build to GitHub.
- Run git pull on the web servers to retrieve the new code from GitHub. This is a node.js site, and we are using forever -w to watch for file changes and update the site accordingly.
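To make the "versioning" step concrete, it boils down to stamping each build with the current date/time and baking that stamp into the asset URLs. A minimal sketch in Node (the CDN hostname and layout here are placeholders, not our real config):

    // Sketch of the datestamp "versioning" step; cdnHost is a placeholder value.
    function datestamp(d) {
      d = d || new Date();
      function pad(n) { return (n < 10 ? '0' : '') + n; }
      return '' + d.getFullYear() + pad(d.getMonth() + 1) + pad(d.getDate()) +
        pad(d.getHours()) + pad(d.getMinutes()) + pad(d.getSeconds());
    }

    var version = datestamp();                         // e.g. "20140324142354"
    var cdnHost = 'https://cdn.example.com';           // placeholder CDN hostname
    var scriptUrl = cdnHost + '/scripts/' + version + '/app.min.js';
    // The grunt build writes scriptUrl into the generated pages so every deploy
    // produces a URL that the CDN and browsers have never cached before.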
We have a route set up in our app to serve the latest version of the app via /scripts/*/app.min.js.
The reason we version is that our CDN is set to cache JavaScript files indefinitely, and the version stamp purposely creates a cache miss whenever the code is updated on the CDN (and in our users' browsers).
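For reference, the wildcard route is roughly the following (simplified; the build/ directory is illustrative, not our exact layout):

    // Simplified wildcard route: regardless of the version segment in the URL,
    // we always serve the latest built file from disk.
    var express = require('express');
    var path = require('path');
    var fs = require('fs');
    var app = express();

    app.get('/scripts/*/app.min.js', function (req, res) {
      res.type('application/javascript');
      fs.createReadStream(path.join(__dirname, 'build', 'scripts', 'app.min.js')).pipe(res);
    });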
This works fine most of the time. It breaks down if one of the servers lags a bit in checking out the new code.
Sometimes a client hits the page while a deploy is in progress and tries to retrieve the new JavaScript code from the CDN. The CDN tries to retrieve it but hits a server that isn't finished checking out the new code yet, and it caches an old or partially downloaded file, causing all sorts of problems.
This problem is exacerbated by the fact that our CDN has many edge locations, so the problem isn't always visible from our office. Some edge locations may have pulled down old/bad code while others may have pulled down new/good code.
Is there a better way to do these deployments that avoids this issue?
Step 4 of your procedure (checking out the new code on the servers) should be:
    git archive --remote $yourgithubrepo --prefix=$timestamp/ | tar -xf -
    stop-server
    ln -sfn $timestamp current
    start-server
Your server should use the current directory (well, symlink) at all times. No matter how long the deploy takes, the application stays in a consistent state.
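On the application side that could look something like this sketch (directory names and port are illustrative):

    // Serve assets through the "current" symlink so a deploy becomes a swap of
    // the symlink target rather than files changing underneath a live checkout.
    var express = require('express');
    var path = require('path');
    var app = express();

    // "current" points at the timestamped release directory created by the deploy step.
    app.use('/scripts', express.static(path.join(__dirname, 'current', 'scripts')));

    app.listen(3000);

Because each release lives in its own timestamped directory, a slow checkout can never leave a server serving half-written files; requests see either the previous release or the new one.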