The power of the web for illustrators: How pixiv uses web technologies for their drawing app

pixiv is an online community service where illustrators and illustration enthusiasts communicate with each other through their artwork. It lets people post their own illustrations. As of May 2023, pixiv has over 84 million users across the globe and more than 120 million posted art pieces.

pixiv Sketch is one of the services provided by pixiv. It lets people draw artwork directly on the website with their fingers or a stylus. It supports a variety of features for drawing amazing illustrations, including numerous types of brushes, layers, and bucket painting, and it also lets people livestream their drawing process.

In this case study, we'll take a look at how pixiv Sketch improved the performance and the quality of their web app by using some new web platform features like WebGL, WebAssembly, and WebRTC.

Why develop a sketching app on the web?

pixiv Sketch was first released on the web and on iOS in 2015. The target audience for the web version was primarily desktop users, as the desktop remains the dominant platform in the illustration community.

Here are pixiv's top two reasons for choosing to develop a web version instead of a desktop app:

  • It is very costly to create apps for Windows, Mac, Linux, and more. The web reaches any browser on the desktop.
  • The web has the best reach across platforms. The web is available on desktop and mobile, and on every operating system.


Creative types of brushes using WebGL

pixiv Sketch has a number of different brushes for users to choose from. Before adopting WebGL, there was only one type of brush, since the 2D canvas context was too limited to depict the complex textures of different brushes, such as the coarse edges of a pencil, or width and color intensity that change with pen pressure.

However, with the adoption of WebGL, they were able to add more variety in brush details and increase the number of available brushes to seven.

The seven different brushes in pixiv ranging from fine to coarse, sharp to unsharp, pixelated to smooth, etc.

Using the 2D canvas context, it was only possible to draw lines with a simple texture and evenly distributed width, like in the following screenshot:

Brush stroke with simple texture.

These lines were drawn by creating paths and stroking them. WebGL reproduces this using point sprites and shaders, as shown in the following code samples.

The following example demonstrates a vertex shader.

precision highp float;

attribute vec2 pos;
attribute float thicknessFactor;
attribute float opacityFactor;

uniform float pointSize;

varying float varyingOpacityFactor;
varying float hardness;

// Calculate hardness from actual point size
float calcHardness(float s) {
  float h0 = .1 * (s - 1.);
  float h1 = .01 * (s - 10.) + .6;
  float h2 = .005 * (s - 30.) + .8;
  float h3 = .001 * (s - 50.) + .9;
  float h4 = .0002 * (s - 100.) + .95;
  return min(h0, min(h1, min(h2, min(h3, h4))));
}

void main() {
  float actualPointSize = pointSize * thicknessFactor;
  varyingOpacityFactor = opacityFactor;
  hardness = calcHardness(actualPointSize);
  gl_Position = vec4(pos, 0., 1.);
  gl_PointSize = actualPointSize;
}

The following example shows sample code for a fragment shader.

precision highp float;

const float strength = .8;
const float exponent = 5.;

uniform vec4 color;

varying float hardness;
varying float varyingOpacityFactor;

float fallOff(const float r) {
    // w is for width
    float w = 1. - hardness;
    if (w < 0.01) {
     return 1.;
    } else {
     return min(1., pow(1. - (r - hardness) / w, exponent));
    }
}

void main() {
    vec2 texCoord = (gl_PointCoord - .5) * 2.;
    float r = length(texCoord);

    if (r > 1.) {
      discard;
    }

    float brushAlpha = fallOff(r) * varyingOpacityFactor * strength * color.a;

    gl_FragColor = vec4(color.rgb, brushAlpha);
}

The use of point sprites makes it straightforward to vary thickness and shading in response to pen pressure, allowing strong and weak lines to be expressed, like these:

Sharp, even brush stroke with thin ends.

Unsharp brush stroke with more pressure applied in the middle.

In addition, the point sprite implementation can attach textures through a separate shader, allowing efficient representation of textured brushes such as pencil and felt-tip pen.
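The hardness and falloff curves in the shaders above are plain math, so they can be sanity-checked on the CPU as well. Here's a TypeScript port of that math as an illustrative sketch (the function names mirror the GLSL, but this is not pixiv's actual code):

```typescript
// TypeScript port of the GLSL hardness/falloff math shown above.
// Hardness grows with point size, so larger sprites get harder edges.
function calcHardness(s: number): number {
  const h0 = 0.1 * (s - 1);
  const h1 = 0.01 * (s - 10) + 0.6;
  const h2 = 0.005 * (s - 30) + 0.8;
  const h3 = 0.001 * (s - 50) + 0.9;
  const h4 = 0.0002 * (s - 100) + 0.95;
  return Math.min(h0, h1, h2, h3, h4);
}

// Alpha falloff from the sprite center (r = 0) to its edge (r = 1).
function fallOff(r: number, hardness: number, exponent = 5): number {
  const w = 1 - hardness; // width of the soft rim
  if (w < 0.01) {
    return 1;
  }
  return Math.min(1, Math.pow(1 - (r - hardness) / w, exponent));
}
```

For example, a point size of 10 yields a hardness of 0.6, so alpha stays at 1 from the center out to r = 0.6 and then falls off smoothly to 0 at the sprite edge.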

Stylus support on the browser

Using a digital stylus has become extremely popular among digital artists. Modern browsers support the PointerEvent API, which enables users to use a stylus on their device: use PointerEvent.pressure to measure pen pressure, and PointerEvent.tiltX and PointerEvent.tiltY to measure the angle of the pen relative to the device.

In order to perform brush strokes with a point sprite, the PointerEvent must be interpolated and converted into a more fine-grained event sequence. In PointerEvent, the orientation of the stylus can be obtained in the form of polar coordinates, but pixiv Sketch converts them to a vector representing the orientation of the stylus before using them.

function getTiltAsVector(event: PointerEvent): [number, number, number] {
  const u = Math.tan((event.tiltX / 180) * Math.PI);
  const v = Math.tan((event.tiltY / 180) * Math.PI);
  const z = Math.sqrt(1 / (u * u + v * v + 1));
  const x = z * u;
  const y = z * v;
  return [x, y, z];
}

function handlePointerDown(event: PointerEvent) {
  const position = [event.clientX, event.clientY];
  const pressure = event.pressure;
  const tilt = getTiltAsVector(event);

  interpolateAndRender(position, pressure, tilt);
}
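The interpolation step mentioned above can be sketched as pure math: given two pointer samples, generate evenly spaced intermediate samples so that a point sprite can be stamped at each one. This is a simplified linear-interpolation illustration (`Sample` and `interpolateSamples` are hypothetical names, not pixiv's implementation):

```typescript
interface Sample {
  position: [number, number];
  pressure: number;
}

// Linearly interpolate `steps` intermediate samples between two
// pointer events so a point sprite can be stamped along the stroke.
function interpolateSamples(a: Sample, b: Sample, steps: number): Sample[] {
  const result: Sample[] = [];
  for (let i = 1; i <= steps; i++) {
    const t = i / steps;
    result.push({
      position: [
        a.position[0] + (b.position[0] - a.position[0]) * t,
        a.position[1] + (b.position[1] - a.position[1]) * t,
      ],
      pressure: a.pressure + (b.pressure - a.pressure) * t,
    });
  }
  return result;
}
```

A real implementation would pick `steps` based on the distance between the two events so sprites overlap enough to form a continuous stroke.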

Multiple drawing layers

Layers are one of the most distinctive concepts in digital drawing. They let users draw different parts of an illustration on top of each other and make edits layer by layer. pixiv Sketch provides layer functions much like other digital drawing apps do.

Conventionally, layers can be implemented with several <canvas> elements, using drawImage() and compositing operations. However, with the 2D canvas context there is no choice but to use the CanvasRenderingContext2D.globalCompositeOperation composition modes, which are predefined and largely limit extensibility. By using WebGL and writing shaders, developers can use composition modes that the API does not predefine. pixiv Sketch implemented the layer feature using WebGL for this greater scalability and flexibility.

Here's the sample code for layer composition:

precision highp float;

uniform sampler2D baseTexture;
uniform sampler2D blendTexture;
uniform mediump float opacity;

varying highp vec2 uv;

// for normal mode
vec3 blend(const vec4 baseColor, const vec4 blendColor) {
  return blendColor.rgb;
}

// for multiply mode, swap in this version instead:
// vec3 blend(const vec4 baseColor, const vec4 blendColor) {
//   return baseColor.rgb * blendColor.rgb;
// }

void main() {
  vec4 blendColor = texture2D(blendTexture, uv);
  vec4 baseColor = texture2D(baseTexture, uv);

  blendColor.a *= opacity;

  float a1 = baseColor.a * blendColor.a;
  float a2 = baseColor.a * (1. - blendColor.a);
  float a3 = (1. - baseColor.a) * blendColor.a;

  float resultAlpha = a1 + a2 + a3;

  const float epsilon = 0.001;

  if (resultAlpha > epsilon) {
    vec3 noAlphaResult = blend(baseColor, blendColor);
    vec3 resultColor =
        noAlphaResult * a1 + baseColor.rgb * a2 + blendColor.rgb * a3;
    gl_FragColor = vec4(resultColor / resultAlpha, resultAlpha);
  } else {
    gl_FragColor = vec4(0);
  }
}
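The compositing arithmetic in this shader can be verified per pixel on the CPU. Below is a TypeScript equivalent of the normal-mode path, written as an illustrative sketch (`RGBA` and `composite` are names chosen here, not from pixiv's code):

```typescript
// Straight (non-premultiplied) RGBA, each channel in 0..1.
type RGBA = [number, number, number, number];

// CPU equivalent of the normal-mode layer composition in the shader.
function composite(base: RGBA, blend: RGBA, opacity: number): RGBA {
  const blendA = blend[3] * opacity;
  const a1 = base[3] * blendA;        // both layers cover the pixel
  const a2 = base[3] * (1 - blendA);  // only the base layer
  const a3 = (1 - base[3]) * blendA;  // only the blend layer
  const resultAlpha = a1 + a2 + a3;
  if (resultAlpha <= 0.001) {
    return [0, 0, 0, 0];
  }
  const out: RGBA = [0, 0, 0, resultAlpha];
  for (let i = 0; i < 3; i++) {
    // Normal mode: blend() simply returns the blend layer's color.
    out[i] = (blend[i] * a1 + base[i] * a2 + blend[i] * a3) / resultAlpha;
  }
  return out;
}
```

For instance, compositing a 50%-opacity green layer over opaque red yields an opaque half-red, half-green pixel, matching what the shader would output.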

Large area painting with the bucket function

The pixiv Sketch iOS and Android apps already provided the bucket feature, but the web version did not. The app version of the bucket function was implemented in C++.

With the codebase already available in C++, pixiv Sketch used Emscripten to compile it to asm.js and bring the bucket function to the web version.


while (!bfsQueue.empty()) {
  Point point = bfsQueue.front();
  /* ... */
}
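While pixiv's bucket fill lives in C++, the underlying breadth-first search is language-agnostic. Here's a minimal TypeScript sketch of the same idea, a simplified flood fill over a pixel grid (illustrative only, not pixiv's actual implementation):

```typescript
// Minimal BFS flood fill: replaces the connected region containing
// (sx, sy) with `fill`. The grid is row-major, one number per pixel.
function bucketFill(
  grid: number[][],
  sx: number,
  sy: number,
  fill: number,
): void {
  const h = grid.length;
  const w = grid[0].length;
  const target = grid[sy][sx];
  if (target === fill) {
    return;
  }
  const queue: [number, number][] = [[sx, sy]];
  grid[sy][sx] = fill;
  while (queue.length > 0) {
    const [x, y] = queue.shift()!;
    // Visit the four direct neighbors.
    for (const [dx, dy] of [[1, 0], [-1, 0], [0, 1], [0, -1]]) {
      const nx = x + dx;
      const ny = y + dy;
      if (nx >= 0 && nx < w && ny >= 0 && ny < h && grid[ny][nx] === target) {
        grid[ny][nx] = fill;
        queue.push([nx, ny]);
      }
    }
  }
}
```

A production fill also handles anti-aliased edges and color tolerance, which is where the C++ implementation earns its keep.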

Using asm.js enabled a performant solution: compared with pure JavaScript, the execution time was shortened by 67%. This is expected to improve further with WebAssembly.

Test details:

  • How: Paint 1180x800px area with bucket function
  • Test device: MacBook Pro (M1 Max)

Execution time:

  • Pure JavaScript: 213.8ms
  • asm.js: 70.3ms

Using Emscripten and asm.js, pixiv Sketch was able to successfully release the bucket feature by reusing the codebase from the platform-specific app version.

Live-streaming while drawing

pixiv Sketch offers the feature to live-stream while drawing, through the pixiv Sketch LIVE web app. This uses the WebRTC API, combining the microphone audio track obtained from getUserMedia() and the MediaStream video track retrieved from the <canvas> element.

const canvasElement = document.querySelector('#DrawCanvas');
const framerate = 24;
const canvasStream = canvasElement.captureStream(framerate);
const videoStreamTrack = canvasStream.getVideoTracks()[0];

const audioStream = await navigator.mediaDevices.getUserMedia({
  video: false,
  audio: {},
});
const audioStreamTrack = audioStream.getAudioTracks()[0];

const stream = new MediaStream();
stream.addTrack(videoStreamTrack);
stream.addTrack(audioStreamTrack);


With the power of APIs like WebGL, WebAssembly, and WebRTC, you can create a complex app on the web platform and scale it across devices. To learn more, explore each of the technologies introduced in this case study: WebGL, WebAssembly, and WebRTC.