fix: properly handle child process stdio chunking #409

VerteDinde merged 3 commits into electron:main
Conversation
Converting individual chunks from UTF-8 to JS strings is problematic because it does not handle UTF-8 characters that are split across chunks properly. Use the proper way of reading string data from streams instead.
@dsanders11 Just for context, I don’t have merge access here :)
Is there a test case for this? I need to port this to
@malept This is a standard bugfix that has happened in many places before. It is even explicitly mentioned in the Node.js streams docs for
I can add a test case, but it doesn’t seem that there are tests specifically for this file. This would be an example test case that could be added to your library, though:

```js
import spawnPromise from './spawn-promise';

const result = await spawnPromise('node', [
  '-e',
  "process.stdout.write(Buffer.from([0xe5, 0xa5])); setTimeout(() => process.stdout.write(Buffer.from([0xbd])), 20)",
]);
expect(result).to.equal('好'); // adjust to your favorite assertion library
```

(It may look contrived to write the bytes of a multi-byte character individually. The more common form of this bug is that a character just happens to land on a chunk boundary when a medium-to-large chunk of data is being written or read, but that is less reliable to test because chunk boundaries are not necessarily consistent across OSes, Node.js versions, etc., so a timer generally does a better job for testing.)
🎉 This PR is included in version 5.2.2 🎉 The release is available on:

Your semantic-release bot 📦🚀