Node-sass: Segmentation fault on Node 8 alpine (docker)


Works with Node 6, and on my host machine (Debian, Node 8).

This is my project; I can also try to make a smaller test case if required.

yarn build v0.24.6                                                                                                  
$ cross-env NODE_ENV=production npm run webpack                                                                     
npm info it worked if it ends with ok                                                                               
npm info using [email protected]                                                                                            
npm info using [email protected]                                                                                          
npm info lifecycle [email protected]~prewebpack: [email protected]                                                      
npm info lifecycle [email protected]~webpack: [email protected]                                                         

> [email protected] webpack /usr/src/app                                                                              
> webpack --progress --colors --config webpack.config.js                                                            

 12% building modules 20/21 modules 1 active ...!/usr/src/app/client/styles/main.scss
Segmentation fault (core dumped)
npm info lifecycle [email protected]~webpack: Failed to exec webpack script                                           
npm ERR! code ELIFECYCLE                                                                                            
npm ERR! errno 139                                                                                                  
npm ERR! [email protected] webpack: `webpack --progress --colors --config webpack.config.js`                          
npm ERR! Exit status 139                                                                                            
npm ERR!                                                                                                            
npm ERR! Failed at the [email protected] webpack script.                                                              
npm ERR! This is probably not a problem with npm. There is likely additional logging output above.                  

npm ERR! A complete log of this run can be found in:                                                                
npm ERR!     /root/.npm/_logs/2017-06-26T22_44_11_188Z-debug.log                                                    
error Command failed with exit code 139.                                                                            
Removing intermediate container 5cb53913df5e
  • NPM version (npm -v):
  • Node version (node -v):
  • Node Process (node -p process.versions):
    { http_parser: '2.7.0',
    node: '8.1.2',
    v8: '',
    uv: '1.12.0',
    zlib: '1.2.11',
    ares: '1.10.1-DEV',
    modules: '57',
    openssl: '1.0.2l',
    icu: '59.1',
    unicode: '9.0',
    cldr: '31.0.1',
    tz: '2017b' }
  • Node Platform (node -p process.platform):
  • Node architecture (node -p process.arch):
  • node-sass version (node -p "require('node-sass').info"):
    node-sass 4.5.3 (Wrapper) [JavaScript]
    libsass 3.5.0.beta.2 (Sass Compiler) [C/C++]
ForsakenHarmony  ·  27 Jun 2017

All comments


Can you add the log to a Gist? I'm not seeing anything in the current output, besides that it failed around a Sass file, that indicates this is a node-sass issue.

I'd also suggest just trying to reproduce it without webpack

nschonni  ·  27 Jun 2017

I'm not quite sure what's going on; it only seems to happen in the Docker build, not if I run it inside the container.

It may also be my code, because this doesn't crash:

FROM node:8-alpine

RUN mkdir -p /usr/app
WORKDIR /usr/app

RUN yarn add node-sass
ENV NODE_ENV production
RUN node -e "require('node-sass').renderSync({data: '* {height: 2px}'});"
ForsakenHarmony  ·  28 Jun 2017

It does not seem to crash. It looks like a permission problem; people usually run as root inside a container. Check our troubleshooting guide for info on unsafe permissions.

saper  ·  28 Jun 2017

Segmentation fault is a type of crash to me

ForsakenHarmony  ·  28 Jun 2017

I am sorry, I didn't scroll that far right. Can you try to get the core file?

saper  ·  28 Jun 2017

Quick question, are project files copied from Debian?

saper  ·  28 Jun 2017

You'll need to delete your node_modules and reinstall your modules inside the container. Copying files (or reusing a cache) between Debian and Alpine is going to result in segfaults.
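A minimal sketch of what that advice looks like in a Dockerfile, assuming the project is copied in from the host and npm is the package manager in use (paths are illustrative):

```dockerfile
FROM node:8-alpine
WORKDIR /usr/src/app

# Copy the project but NOT node_modules (add it to .dockerignore), then
# install fresh inside the container so binding.node is compiled against
# Alpine's musl libc rather than the Debian host's glibc.
COPY . .
RUN rm -rf node_modules && npm install
```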

xzyfer  ·  30 Jun 2017

@saper the core file?
Also yes, copied from Debian.
node_modules doesn't get copied (it doesn't exist).
I get the feeling it's the build itself.

ForsakenHarmony  ·  30 Jun 2017

Your binding.node is probably copied/cached from Debian; that way it has to crash.

saper  ·  30 Jun 2017

It also happens in CI.

ForsakenHarmony  ·  30 Jun 2017

Having the exact same issue. Fails with a segmentation fault after a clean build and run inside an Alpine image based on this Dockerfile (Node 8.1 + build essentials).

Works fine when doing the same thing inside image from this Dockerfile (Node 6.11 + build essentials).

mikestead  ·  2 Jul 2017

Between 6 and 8, the node images changed from Alpine 3.4 to 3.6 so there may be some difference there. We did use their image to build our binaries though

nschonni  ·  3 Jul 2017

@mikestead can you run npm rebuild node-sass --force within the Node 8.1 container and confirm whether the segfault is still present?

Also, can you supply a minimal scss code sample that produces the segfault so we can investigate further?

xzyfer  ·  3 Jul 2017

Tried rebuilding in the Dockerfile; it didn't change anything.

ForsakenHarmony  ·  3 Jul 2017

@ForsakenHarmony thanks. If a fresh local recompiled binary is still segfaulting I'm not sure what we can do. This could be an issue with Node 8 or Nan.

xzyfer  ·  4 Jul 2017

FWIW I'm seeing something similar but nothing to do with node-sass.

I've had a node application running FROM node:8-alpine for a while now with no problems. I've been toying with using babel-preset-env to transpile code in a version-aware manner. Presumably this means that more of my code is now using native ES features that are available in Node 8 rather than transpiled ES5 code.

Anyway, something has changed and the app crashes with Segmentation Fault when I try to log in (works fine up until this point). Switching the Dockerfile back to FROM node:8 fixes the issue.

djskinner  ·  7 Jul 2017

This is the Dockerfile I tested with; the project is linked in the main post.

ForsakenHarmony  ·  7 Jul 2017

I am experiencing the same problem with a switch from Node 6 to 8.1.4 (alpine-based images).

Edit: I can confirm that not using an alpine-based image solves the problem (at least my problem).

willdurand  ·  13 Jul 2017

Can anyone provide a backtrace for this?

saper  ·  14 Jul 2017

I could not reproduce this bug with this example, but in my case it happens too. This is a trace from my app.

pawelpalka81  ·  18 Jul 2017

Having the same problem with Node 8.0.0 and Alpine, specifically this docker image: mhart/alpine-node:8.1.3, with node-sass version 4.5.3. Nothing to really go on in my CI logs except:

Segmentation fault (core dumped)
error Command failed with exit code 139.

This only happens on GitLab CI when using Docker-in-Docker. It works on my local macOS machine.

We had a similar problem when using bcrypt, see issue 528. Running npm rebuild node-sass on the CI does not help.

idmontie  ·  21 Jul 2017

Hello everyone participating in this bug: I have seen similar reports addressed to different projects using Node Alpine Docker images. There is currently little evidence that node-sass is at fault here. (Just because something breaks with node-sass is not enough; it is only one of the native modules that may cause trouble.)

Unless somebody comes up with a working debugger and a stack trace, it will be very hard to help and we will need to close this issue. Please keep in mind that node-sass is very picky about using the same versions of libc (musl) and C++ runtime libraries, which mostly means the same C++ compiler has to be used for node-sass and for node itself.
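As a hedged sketch (not from the thread), one quick way to check which C library an environment uses before suspecting node-sass:

```shell
# Print whether this system's dynamic linker is musl (Alpine) or glibc
# (Debian/Ubuntu); a binding.node compiled against one libc will
# misbehave or crash when loaded under the other.
if ldd --version 2>&1 | grep -qi musl; then
  echo "musl"
else
  echo "glibc"
fi
```

Run this both on the host and inside the container; if the answers differ, any copied or cached native binary is suspect.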

Maybe the maintainers of the docker image can help you better, at least with getting the debugger running.

saper  ·  22 Jul 2017

Not on node 8, on node 7, but...

Could it be a difference between which sass binary a global install uses compared to a local one?
Global on top, the one brought in by npm in the binding on the bottom.

DianaOlympos  ·  26 Jul 2017

@saper I guess I should stop using sass then

ForsakenHarmony  ·  26 Jul 2017

Same here,
with node 8 and node-sass 4.5.3,
with a clean node:alpine.

pandada8  ·  31 Jul 2017

Has anyone tried the recommendation of adding the libc6 compatibility package from the Node image repo?
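For anyone wanting to try it, a minimal sketch of that recommendation:

```dockerfile
FROM node:8-alpine
# The glibc compatibility shim suggested in the Node image docs; it maps
# some glibc symbol lookups onto musl, but it is not a full glibc.
RUN apk add --no-cache libc6-compat
```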

nschonni  ·  1 Aug 2017

@nschonni adding libc6-compat didn't work for me

jackbrown  ·  2 Aug 2017

@jackbrown thanks for trying ❤️

nschonni  ·  2 Aug 2017

I managed to procure a stack trace! I used the node8-alpine Docker image, with the following steps to reproduce:

docker run --security-opt seccomp:unconfined --cap-add SYS_PTRACE -it node8-alpine /bin/sh
apk update && apk add git make g++ python2 gdb
cd /tmp
git clone
cd node-sass
yarn
node scripts/build.js --force --debug
gdb node
set args bin/node-sass node_modules/sass-spec/spec/scss/huge/input.scss
run
bt

I have attached a backtrace that I got:


enko  ·  7 Aug 2017

This is great @enko. I think the root cause of the problem is

If that is confirmed, there is not much we can do about it.

See for example discussion

saper  ·  7 Aug 2017

Here is my workaround for the issue, and the image is top20/node:8-alpine.

jubel-han  ·  27 Aug 2017

If the stackfix is doing what I think it is doing, is that a confirmation that we have a stack problem? Of course, it is possible that the newer libsass or the newer node needs more stack for some reason, but as long as it does not loop forever it is not a bug.

That would mean closing this bug and declaring musl Docker images unusable for node-sass.

saper  ·  27 Aug 2017

@jubel-han Thanks! Your workaround worked for me

johnwebbcole  ·  1 Sep 2017

I experienced this issue as well, and I think this is also related. My fix was to change the build stage's image to one based on Alpine 3.7, which brings with it Node 8. I also force an npm rebuild of node-sass in that image. This resolves the issue for me.

carlodicelico  ·  24 Jan 2018
jwalton  ·  1 Feb 2018

@jwalton it is exactly this problem

saper  ·  1 Feb 2018

This was merged in August. I wonder when and where it landed?

jwalton  ·  1 Feb 2018

A different kind of fix could possibly be the following addition to your Dockerfiles.

I had the same issue without using Sass.
It only occurs if the Docker host itself has a hardened Linux kernel (Alpine in my case).

cw789  ·  9 Feb 2018

This has been worked around by increasing the thread stack size in the libuv library, and the fix became available in node 8.

Therefore using anything older than node 8 with alpine is discouraged.
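Putting the advice from this thread together, a hedged sketch of a setup that should avoid the crash (Node 8 or newer, with the binding compiled in-container; paths are illustrative):

```dockerfile
FROM node:8-alpine
WORKDIR /usr/src/app
COPY package.json .
# Install (and compile native bindings) inside the Alpine container so
# node-sass links against musl; force a rebuild in case a prebuilt
# glibc binary was downloaded or cached.
RUN npm install && npm rebuild node-sass --force
COPY . .
```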

saper  ·  27 Feb 2018

In a night course, we were asked to set up a Node HTTP server in Docker.

I looked up what was good for Docker and found it was Alpine Linux, so I installed that in a VM.
Then I installed Docker in that, and then I followed a few basic tutorials to try to get Node going.
None worked, and I kept getting error code 139.

Example 1:
Segmentation fault
The command '/bin/sh -c npm install' returned a non-zero code: 139

Example 2:
107a1b769d33 webapp "node index.js" 9 seconds ago Exited (139)

I searched for a very long time, and eventually I found this thread and saw the reply from @cw789, which helped. It did not explain what I should do, but I pieced it together from looking at the solution and at how some others patched their systems.

The fix for me is to add this to my Dockerfile before npm gets called:

RUN apt-get update -y && apt-get dist-upgrade -y && \
    apt-get install -y --no-install-recommends paxctl && \
    paxctl -mC "$(which node)"

It also works if I add it just after:
FROM node:carbon

If this comment seems long and includes a lot of technical lines, that is intentional. If anyone else ends up googling these errors like I did, I hope this helps.
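Assembled into a complete Dockerfile, the fix described above looks roughly like this (flags taken verbatim from the comment):

```dockerfile
FROM node:carbon
# Disable PaX MPROTECT on the node binary so V8's JIT can map
# writable+executable pages under a hardened (grsecurity/PaX) host kernel.
RUN apt-get update -y && \
    apt-get install -y --no-install-recommends paxctl && \
    paxctl -mC "$(which node)"
```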

paulmaraireland  ·  18 Mar 2018

Can anyone confirm whether this is still an issue with 4.8.3?

xzyfer  ·  24 Mar 2018

@xzyfer I confirm: 4.8.3 causes Segmentation fault (core dumped). On 4.8.2 we have no problems.

xakep139  ·  27 Mar 2018

I still have this issue on 4.10, with node:11.1-alpine

happysalada  ·  19 Nov 2018

I get it too on node:11.2-alpine and node:11.2-slim.
If anybody else runs into this: with the full image node:11.2 it happens less often (but it still happens sometimes).

happysalada  ·  20 Nov 2018

using node:10.13-alpine seems to solve the problem

happysalada  ·  20 Nov 2018

I was facing the same problem serving an Express server. For some reason the -alpine version throws a segmentation fault.

Using just node:8 solved my problem.

testica  ·  18 May 2019

Using node:10.16.3 gave me a seg fault; changing it to node:8 or node:latest (as of 27 Sept 2019) worked for me.

kozr  ·  27 Sep 2019

using node:10.13-alpine seems to solve the problem

This resolves our problem.

superern  ·  10 Feb 2020