This assumes we are using a *nix-based OS.
Scenario
We have a secure system with no outside-world connection: nothing, nada, zilch. Getting code on and off is done using a flash drive. (Insert crying emoticon here.)
Problem
We need to get our Node app onto the system and be able to run npm script commands, making sure all dependencies are met.
Location, location, location: installing modules manually
We can't use `npm install` because npm will still attempt to connect over the network; you'll just be left hanging. (A way to use npm offline is described later.)
When npm installs a package globally it copies it to `/usr/local/lib/node_modules` (or `/usr/lib/node_modules`).
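Since the "install" is really just a copy, moving a module by hand can be sketched like this. The `/tmp` paths below stand in for the flash drive mount point and the global modules directory, which are assumptions for illustration:

```shell
# Simulate the manual install: copy a module folder from the flash
# drive (faked here as /tmp/usb) into the global modules directory
# (faked here as /tmp/global_modules; the real one is
# /usr/local/lib/node_modules and would need sudo).
mkdir -p /tmp/usb/my-module /tmp/global_modules
echo "module.exports = 'hi';" > /tmp/usb/my-module/index.js

cp -r /tmp/usb/my-module /tmp/global_modules/
ls /tmp/global_modules/my-module/index.js
```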
The `NODE_PATH` environment variable should contain the above path to the modules. If you want to change the default location for global modules, just change `NODE_PATH`. So in your `.bashrc` file add:
export NODE_PATH=/usr/local/lib/node_modules
You might also need to add it to `/etc/environment` for it to cascade to all users. Note that `/etc/environment` takes plain `KEY=value` lines, without `export`:
NODE_PATH=/usr/local/lib/node_modules
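A quick way to check that `NODE_PATH` resolution actually works is to point it at a directory containing a dummy module and resolve it with node. This is only a sketch; `demo-mod` and the `/tmp` path are made up:

```shell
# Create a throwaway module in a temporary modules directory.
mkdir -p /tmp/node_modules/demo-mod
echo "module.exports = 'hello from demo-mod';" > /tmp/node_modules/demo-mod/index.js

# Point NODE_PATH at it and let node resolve the module.
export NODE_PATH=/tmp/node_modules
node -e "console.log(require('demo-mod'))"   # prints: hello from demo-mod
```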
Command line packages
You have to create symlinks to the modules' `bin` scripts in order to run them from the command line; `browserify` or `mocha`, for example.
ln -s /usr/local/lib/node_modules/browserify/bin/browserify /usr/local/bin/browserify
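The same symlink pattern can be tried end-to-end with a throwaway script. Everything under `/tmp` here is hypothetical; a real module's executable lives in its `bin` folder, as named in the `bin` field of its `package.json`:

```shell
# Fake a package's bin script...
mkdir -p /tmp/demo-cli/bin
printf '#!/bin/sh\necho demo-cli ran\n' > /tmp/demo-cli/bin/demo-cli
chmod +x /tmp/demo-cli/bin/demo-cli

# ...then symlink it somewhere runnable (here just /tmp) and execute it.
ln -sf /tmp/demo-cli/bin/demo-cli /tmp/demo-cli-link
/tmp/demo-cli-link   # prints: demo-cli ran
```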
Now you can run `browserify` anywhere.
To verify global modules just run `npm list -g`.
Deploying a tarball
To bundle into a tarball:
npm pack
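`npm pack` works without touching the registry when the package is already on disk. A minimal sketch (the `pack-demo` package and its `/tmp` location are made up):

```shell
# Pack a minimal throwaway package; npm writes <name>-<version>.tgz
# into the current directory, ready to carry over on the flash drive.
mkdir -p /tmp/pack-demo && cd /tmp/pack-demo
printf '{ "name": "pack-demo", "version": "1.0.0" }\n' > package.json
npm pack
ls pack-demo-1.0.0.tgz
```

On the airgapped machine the resulting tarball can then be installed with `npm install ./pack-demo-1.0.0.tgz`.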
Local
To have a local npm registry, check out `local-npm`.
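Once `local-npm` is running, npm can be pointed at it instead of the public registry. A sketch, assuming local-npm's default port of 5080:

```shell
# Point npm at the local replicated registry (port 5080 is local-npm's
# default; adjust if configured differently).
npm set registry http://127.0.0.1:5080

# Later, to switch back to the public registry:
npm set registry https://registry.npmjs.org
```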