/*
\file
\brief silence !
*/

/*!

\defgroup webgl_grp WebGL API
\ingroup jsapi_grp
\brief WebGL API.

# Foreword
GPAC supports the WebGL 1.0 Core API. For more documentation, please check https://www.khronos.org/registry/webgl/specs/latest/1.0

The WebGL API cannot currently be loaded when using SVG or VRML scripts. It is only available for JSFilter, and shall be loaded as a JS module with the name "webgl":

\code
import * as webgl from 'webgl'

...
\endcode

or

\code
import {WebGLContext} from 'webgl'

...
\endcode

The API implements most of the WebGL 1.0 context calls. The following features are not supported:
- premultiplied alpha (gl.UNPACK_PREMULTIPLY_ALPHA_WEBGL and related constants)
- Y-flip (gl.UNPACK_FLIP_Y_WEBGL); this must be done in the shader
- WebGL extensions (gl.getExtension)

# WebGL Context

The WebGL API in GPAC does not use any canvas element, since it is designed to run outside of a DOM. The WebGL context shall therefore be created using a constructor call.
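
For example, a 1280x720 offscreen context (as used in the larger example later in this page) is created with:
\code
import {WebGLContext} from 'webgl'

let gl = new WebGLContext(1280, 720);
\endcode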

A filter is responsible for deciding when to issue GL calls: this can be in a process function or in a task callback (see \ref JSFilter.post_task). There is no such thing as requestAnimationFrame in GPAC. Consequently, the owning filter is responsible for (see the sketch after this list):
- activating and deactivating the context in order to make the associated GL context active and bind / unbind the underlying framebuffer
- resizing the framebuffer when needed
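
A minimal sketch of this pattern in a process function; size_changed, new_width and new_height are hypothetical state variables tracked by your filter:
\code
filter.process = function() {
  //make the GL context current and bind the underlying framebuffer
  gl.activate(true);

  //resize the framebuffer when needed (size_changed, new_width, new_height are assumed to be tracked elsewhere)
  if (size_changed) {
    gl.resize(new_width, new_height);
    size_changed = false;
  }

  //issue GL calls here
  //...

  //done with GL for this frame
  gl.activate(false);
  return GF_OK;
};
\endcode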

\warning it is unsafe to assume that your filter owns the OpenGL context; there may be other filters operating on the context. This means that context state (viewport, clearColor, etc.) shall be restored when reactivating the context.
\note WebGL filters always run on the main thread to avoid concurrent usage of the OpenGL context, but this might change in the future.

The WebGL API in GPAC works by default on offscreen framebuffer objects:
- the color attachment is always a texture object, RGBA 32 bits or RGB 24 bits (see WebGLContextAttributes)
- the depth attachment is a renderbuffer by default and cannot be exported; this behaviour can be changed by setting the "depth" attribute of the WebGLContextAttributes object to "texture" before creation, thereby creating a depth texture attachment (see the example below); the format is 24-bit precision integer (desktop) or 16-bit precision integer (iOS, Android).
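
For example, a context whose depth attachment can later be exported as a texture is created as follows:
\code
//request a depth texture attachment instead of the default renderbuffer
let gl = new WebGLContext(1280, 720, {depth: 'texture'});
\endcode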

The underlying framebuffer color texture attachment and, if enabled, depth texture attachment can be dispatched as a GPAC packet using \ref FilterPid.new_packet; this allows forwarding framebuffer data to other filters without copying the framebuffer content to system memory.

When forwarding a framebuffer, it is recommended not to draw anything or activate the GL context until all references to the packet holding the framebuffer are consumed. A callback function is used for that, see the example below.

\note you can always use glReadPixels to read back the framebuffer and send packets using the usual FilterPacket tools.
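
A minimal readback sketch, assuming the output PID is configured for raw RGBA video, that width, height, ipck and opid are managed elsewhere in the filter, and that \ref FilterPid.new_packet accepts an ArrayBuffer to copy:
\code
//width, height, ipck and opid are assumed to be managed elsewhere in the filter
let pixels = new Uint8Array(width * height * 4);
gl.readPixels(0, 0, width, height, gl.RGBA, gl.UNSIGNED_BYTE, pixels);
//send the read-back data as a regular packet
let opck = opid.new_packet(pixels.buffer);
opck.copy_props(ipck);
opck.send();
\endcode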

The WebGL API in GPAC can also be configured to run on the primary framebuffer; this is achieved by adding a "primary" attribute set to true to the WebGLContextAttributes object at creation. In this case (see the example after this list):
- the depth buffer cannot be delivered as a texture
- the video output SHALL be created before WebGL context creation (typically by loading the video output filter before the JS filter)
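
For example, a context bound to the primary framebuffer is created with:
\code
//draw directly to the video output (primary) framebuffer
let gl = new WebGLContext(1280, 720, {primary: true});
\endcode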


# Texturing

WebGL offers two ways of creating textures:
- regular texImage2D using ArrayBuffers created from the script. This is obviously supported in GPAC.
- texImage2D from TexImageSource

TexImageSource in WebGL can be ImageBitmap, ImageData, HTMLImageElement, HTMLCanvasElement, HTMLVideoElement or OffscreenCanvas.

Since GPAC doesn't run WebGL in a DOM/browser, these objects are not available for creating textures. Instead, the following objects can be used:
- EVG Texture
- NamedTexture

## Using EVG textures

EVG Texture can be used to quickly load JPG or PNG images:
\code
import * as evg from 'evg'

let texture = gl.createTexture();
let tx = new evg.Texture('source.jpg');
gl.bindTexture(gl.TEXTURE_2D, texture);
//internalformat, format and type are ignored when uploading from an EVG texture
gl.texImage2D(gl.TEXTURE_2D, 0, 0, 0, 0, tx);
//at this point the data is uploaded on GPU, the EVG texture is no longer needed and can be GC'ed
\endcode

EVG Texture combined with EVG Canvas can be used to draw text and 2D shapes:
\code
let canvas = new evg.Canvas(200, 200, 'rgba');
/* draw stuff on canvas
...
*/
let texture = gl.createTexture();
let tx = new evg.Texture(canvas);
gl.bindTexture(gl.TEXTURE_2D, texture);
gl.texImage2D(gl.TEXTURE_2D, 0, 0, 0, 0, tx);
//at this point the data is uploaded on GPU, the EVG texture and canvas are no longer needed and can be GC'ed
\endcode

For more info on drawing with EVG, see \ref jsevg_grp

## Using named textures

Dealing with pixel formats in OpenGL/WebGL/GLSL can be quite heavy:
- some pixel formats come in various component orderings, of which only a subset is natively supported (e.g. RGBA is OK, but BGRA is not)
- some pixel formats are not natively supported by OpenGL/WebGL (typically, most/all flavors of YUV)
- some pixel formats are planar and require more than one texture to draw them, which is quite heavy to set up
- some video decoders might output frames directly as a set of one or more OpenGL textures on the GPU (NVDec, iOS VideoToolbox, Android MediaCodec)

In order to simplify your code and deal efficiently with most formats, the WebGL API in GPAC introduces the concept of named textures.


A named texture is a texture created with a name:
\code
let tx = gl.createTexture('myVidTex');
\endcode

The texture data is then associated using upload():
\code
//source data is in system memory or already in OpenGL textures
let pck = input_pid.get_packet();
tx.upload(pck);

//or

//source data is only in system memory
tx.upload(some_evg_texture);
\endcode

Regular bindTexture and texImage2D can also be used if you don't like changing your code too much:
\code
let pck = input_pid.get_packet();
gl.bindTexture(gl.TEXTURE_2D, tx);
//source data is in system memory or already in OpenGL textures
gl.texImage2D(gl.TEXTURE_2D, 0, 0, 0, 0, pck);

//or

gl.bindTexture(gl.TEXTURE_2D, tx);
//source data is only in system memory
gl.texImage2D(gl.TEXTURE_2D, 0, 0, 0, 0, some_evg_texture);
\endcode

The magic comes in when creating your shaders: any call to texture2D on a sampler2D using the same name as the NamedTexture is rewritten before compilation and replaced with GLSL code handling the pixel format conversion for you!
\code
varying vec2 vTextureCoord;
uniform sampler2D myVidTex; //this will get replaced before compilation
uniform sampler2D imageSampler; //this will NOT get replaced
void main(void) {
  vec2 tx = vTextureCoord;
  vec4 vid = texture2D(myVidTex, tx); //this will get replaced before compilation
  vec4 img = texture2D(imageSampler, tx); //this will NOT get replaced
  vid.a = img.a;
  gl_FragColor = vid;
}
\endcode

The resulting fragment shader may contain one or more sampler2D uniforms and a few additional uniforms, but these are managed for you by GPAC!

The named texture is then used as usual:
\code
  gl.activeTexture(gl.TEXTURE0);
  gl.bindTexture(gl.TEXTURE_2D, tx);
  //this one is ignored for named textures (the WebGLUniformLocation object exists but is deactivated), but you can keep your code as usual
  gl.uniform1i(myVidTexUniformLocation, 0);

  gl.activeTexture(gl.TEXTURE0 + tx.nb_textures);
  gl.bindTexture(gl.TEXTURE_2D, imageTexture);
  gl.uniform1i(imageSamplerUniformLocation, tx.nb_textures);
\endcode

In the above code, note the usage of tx.nb_textures: this fetches the number of texture units used by the named texture, allowing multitexturing to be set up properly.
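
For completeness, the uniform locations used above are fetched as with any WebGL program once it has been linked (program is assumed to be your linked WebGLProgram):
\code
//program is the linked WebGLProgram using the fragment shader above
let myVidTexUniformLocation = gl.getUniformLocation(program, 'myVidTex');
let imageSamplerUniformLocation = gl.getUniformLocation(program, 'imageSampler');
\endcode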

\warning A consequence of this is that you cannot reuse a fragment shader for both a NamedTexture and a regular WebGLTexture; this will simply not work.

\warning Using explicit location assignment in your shader on a named texture sampler2D is NOT supported: \code layout(location = N) \endcode


The core concept for dealing with NamedTexture is that the fragment shader sources must be set AFTER the texture has been set up (upload / texImage2D). Doing it before will result in an unmodified fragment shader and missing uniforms.

To summarize, NamedTexture allows you to use existing GLSL fragment shader sources with any pixel format for your source, provided that (see the sketch after this list):
- you tag the texture with the name of the sampler2D you want to replace
- you upload data to your texture before creating the program using it
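
A minimal sketch of this ordering; fragmentShaderSource is assumed to use a sampler2D named 'myVidTex' and vs to be an already compiled vertex shader:
\code
//1 - tag the texture with the sampler2D name and upload data first
let tx = gl.createTexture('myVidTex');
tx.upload(input_pid.get_packet());

//2 - only then compile the fragment shader and link the program
let fs = gl.createShader(gl.FRAGMENT_SHADER);
gl.shaderSource(fs, fragmentShaderSource);
gl.compileShader(fs);
let program = gl.createProgram();
gl.attachShader(program, vs); //vs: previously compiled vertex shader
gl.attachShader(program, fs);
gl.linkProgram(program);
\endcode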

The NamedTexture does not track any pixel format or image width changes, mostly because the program needs recompiling anyway. This means that whenever the pixel format or source image width changes for a NamedTexture, you must:
- reset the NamedTexture by calling reconfigure()
- destroy your GLSL program
- upload the new data to your NamedTexture
- set up your fragment shader source and program again

\note The width must be checked, since for packed YUV it is needed and exposed as a uniform. You could also modify this uniform manually, see the "Inside named textures" section below.

## Example of using source FilterPacket with NamedTexture

\code
import {WebGLContext, Matrix} from 'webgl'

//let our filter accept and produce only raw video
filter.set_cap({id: "StreamType", value: "Video", inout: true} );
filter.set_cap({id: "CodecID", value: "raw", inout: true} );

//setup WebGL
let gl = new WebGLContext(1280, 720);
//setup named texture
let tx = gl.createTexture('MyVid');
//program info (program + uniform locations), built by setupProgram (not shown) once the texture format is known
let programInfo = null;

let width=0;
let height=0;
let pix_fmt = '';
let ipid = null;
let opid = null;

const vertexShaderSource = `
...
`;

const fragmentShaderSource = `
varying vec2 vTextureCoord;
uniform sampler2D MyVid; //same as our named texture
void main(void) {
  vec2 tx = vTextureCoord;
  //vertical flip
  tx.y = 1.0 - tx.y;
  gl_FragColor = texture2D(MyVid, tx);
}
`;

filter.configure_pid = function(pid) {
  if (!opid) {
    opid = this.new_pid();
  }
  ipid = pid;
  //copy all props from input pid
  opid.copy_props(pid);
  //default pixel format for the WebGL context framebuffer is RGBA
  opid.set_prop('PixelFormat', 'rgba');
  //drop these properties
  opid.set_prop('Stride', null);
  opid.set_prop('StrideUV', null);
  //check if pixel format, width or height have changed by checking props
  let n_width = pid.get_prop('Width');
  let n_height = pid.get_prop('Height');
  let pf = pid.get_prop('PixelFormat');
  if ((n_width != width) || (n_height != height)) {
    width = n_width;
    height = n_height;
    //you may want to resize your canvas here
  }
  if (pf != pix_fmt) {
    pix_fmt = pf;
    //dereference the program (wait for GC to destroy it) or delete it using gl.deleteProgram
    programInfo = null;
    //notify the texture it needs reconfiguring
    tx.reconfigure();
  }
}

filter.process = function()
{
  //previous frame is still being used by output(s), do not modify it (although you technically could...)!
  if (filter.frame_pending) return GF_OK;
  //get source packet
  let ipck = ipid.get_packet();
  if (!ipck) return GF_OK;

  //request the OpenGL context to be the current one
  gl.activate(true);

  //upload texture - this is the same as tx.upload(ipck);
  gl.bindTexture(gl.TEXTURE_2D, tx);
  gl.texImage2D(gl.TEXTURE_2D, 0, 0, 0, 0, ipck);

  //program not created, do it now that we know the texture format
  if (!programInfo) programInfo = setupProgram(gl, vertexShaderSource, fragmentShaderSource);
  /*draw scene
  setup viewport, matrices, uniforms, etc.
  ...
  */
  //set video texture
  gl.activeTexture(gl.TEXTURE0);
  gl.bindTexture(gl.TEXTURE_2D, tx);
  //this one is ignored for GPAC named textures, just kept to avoid breaking usual WebGL programming
  gl.uniform1i(programInfo.uniformLocations.txVid, 0);

  /*
  ...

  drawElements / drawArray ...

  end draw scene
  */
  //make sure all OpenGL calls are done before sending the packet
  gl.flush();

  //indicate we are done with the OpenGL context
  gl.activate(false);

  //create packet from the WebGL framebuffer, with a callback to get notified when the framebuffer is no longer in use by other filters
  let opck = opid.new_packet(gl, () => { filter.frame_pending=false; } );

  //remember we are waiting for the notification
  this.frame_pending = true;
  //copy all properties of the source packet
  opck.copy_props(ipck);

  //note that we drop the source packet only after the draw in this example: since the source data could be OpenGL textures, we don't want to discard them until we are done
  ipid.drop_packet();

  //send the packet!
  opck.send();
}
\endcode

## Inside named textures

NamedTexture allows supporting all pixel formats currently used in GPAC without any conversion before GPU upload. Namely:
- YUV 420, 422 and 444 planar 8 bits (and 10 bits on desktop versions)
- YUYV, YVYU, UYVY, VYUY 422 8 bits
- NV12 and NV21 8 bits (and 10 bits on desktop versions)
- RGBA, ARGB, BGRA, ABGR, RGBX, XRGB, BGRX, XBGR
- AlphaGrey and GreyAlpha
- Greyscale
- RGB 444, RGB 555, RGB 565

If you want to have fun, the underlying uniforms are defined in the fragment shader, with $NAME$ being replaced by the name of the NamedTexture (an illustrative sketch follows this list):
- uniform sampler2D \_gf\_$NAME$\_1: RGB (all variants), packed YUV (all variants) or Y plane, always defined
- uniform sampler2D \_gf\_$NAME$\_2: U or UV plane, if any, undefined otherwise
- uniform sampler2D \_gf\_$NAME$\_3: V plane, if any, undefined otherwise
- uniform float \_gf\_$NAME$\_width: image width for packed YUV, undefined otherwise
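
As a purely illustrative sketch (the code GPAC actually generates is managed for you and may differ), a 3-plane planar YUV NamedTexture named MyVid could be sampled manually like this:
\code
//illustrative only: manual sampling of the internal planes of a planar YUV NamedTexture called MyVid
uniform sampler2D _gf_MyVid_1; //Y plane
uniform sampler2D _gf_MyVid_2; //U plane
uniform sampler2D _gf_MyVid_3; //V plane
varying vec2 vTextureCoord;
void main(void) {
  vec3 yuv;
  yuv.x = texture2D(_gf_MyVid_1, vTextureCoord).r;
  yuv.y = texture2D(_gf_MyVid_2, vTextureCoord).r;
  yuv.z = texture2D(_gf_MyVid_3, vTextureCoord).r;
  //apply the YUV to RGB conversion of your choice here
  gl_FragColor = vec4(yuv, 1.0);
}
\endcode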

The texture formats are as follows:
- RGB 444, RGB 555, RGB 565 are uploaded as alpha grey images
- NV12 and NV21 are uploaded as a greyscale image for Y and an alpha grey image for UV
- all planar formats are uploaded as one greyscale image per plane
- all 10-bit support is done using 16-bit textures, the GL_UNSIGNED_SHORT type and GL_RED_SCALE/GL_ALPHA_SCALE

\note Currently 10-bit support is disabled on iOS and Android since GL_RED_SCALE/GL_ALPHA_SCALE are not supported in GLES2

The YUV to RGB conversion values are currently hardcoded; we will expose them as uniforms in the future.
YUV+alpha support is yet to be implemented.


# Matrices
The 'evg' module comes with a Matrix object to avoid external dependencies for matrix manipulation.
@{

*/

/*! Extensions for GPAC WebGL*/
interface WebGLContext : implements WebGLRenderingContextBase {

	/*! creates a new WebGL context
	\param width the target width in pixels of the drawing buffer
	\param height the target height in pixels of the drawing buffer
	\param context_attributes the context attributes as defined by WebGL (see https://www.khronos.org/registry/webgl/specs/latest/1.0/#5.2)
	*/
	WebGLContext(unsigned long width, unsigned long height, WebGLContextAttributes context_attributes);

	/*! creates a new WebGL context (mainly defined for future canvas simulation)
	\param canvas_obj an object exposing "width" and "height" properties
	\param context_attributes the context attributes as defined by WebGL (see https://www.khronos.org/registry/webgl/specs/latest/1.0/#5.2)
	*/
	WebGLContext(Object canvas_obj, WebGLContextAttributes context_attributes);


	/*! activates or deactivates a WebGL context
	\param activate if true, binds the associated framebuffer. If false, unbinds it
	*/
	void activate(boolean activate);


	/*! resizes the underlying framebuffer to the indicated size
	\param width new width in pixels
	\param height new height in pixels
	*/
	void resize(unsigned long width, unsigned long height);

	/*! uploads the content of the EVG Texture to the bound texture. The bound texture can be a WebGLTexture or a NamedTexture
	\param target ignored, defaults to gl.TEXTURE_2D
	\param level same as in regular texImage2D
	\param internalformat ignored, overridden during upload based on the input data
	\param format ignored, overridden during upload based on the input data
	\param type ignored, overridden during upload based on the input data
	\param source the source Texture to use
	*/
	void texImage2D(GLenum target, GLint level, GLint internalformat, GLenum format, GLenum type, Texture source);

	/*! uploads the content of the FilterPacket to the bound texture. The bound texture shall be a NamedTexture
	\param target ignored, defaults to gl.TEXTURE_2D
	\param level same as in regular texImage2D
	\param internalformat ignored, overridden during upload based on the input data
	\param format ignored, overridden during upload based on the input data
	\param type ignored, overridden during upload based on the input data
	\param source the source FilterPacket to use
	*/
	void texImage2D(GLenum target, GLint level, GLint internalformat, GLenum format, GLenum type, FilterPacket source);

	/*! creates a named texture
	\param name the name of the texture
	\return a new named texture
	*/
	NamedTexture createTexture(DOMString name);

	/*! binds a named texture to the given target
	\param target ignored, defaults to gl.TEXTURE_2D
	\param texture the named texture to bind, or null to unbind textures
	*/
	void bindTexture(GLenum target, NamedTexture texture);


	/*! queries the list of supported extensions
	\param use_gl_exts if true, queries all extensions supported by the underlying OpenGL implementation. Otherwise, queries only supported WebGL extensions (none at the moment)
	\return an array of strings, each entry being the name of a supported extension
	*/
	sequence<DOMString>? getSupportedExtensions(optional boolean use_gl_exts=false);
};

/*! Named texture object, see \ref webgl_grp*/
interface NamedTexture {
/*! number of underlying textures. This can be useful when doing multi-texturing to get the next texture unit slot:
\code
nextActiveTexture = gl.TEXTURE0 + named_tx.nb_textures;
\endcode
*/
readonly attribute unsigned long nb_textures;
/*! set to true if the input to this named texture is a set of one or more OpenGL textures rather than system memory data*/
readonly attribute boolean is_gl_input;
/*! name of the texture, as passed upon creation*/
readonly attribute DOMString name;
/*! indicates if PBO is used for data transfer. By default named textures are created with no PBO transfer. To enable it, set this to true before the first texture upload*/
attribute boolean pbo;

/*! indicates the underlying pixel format has been modified and that the texture should be reevaluated*/
void reconfigure();
/*! builds the named texture from an input filter packet
\param pck the filter packet to use as source for the texture data*/
void upload(FilterPacket pck);
/*! builds the named texture from an EVG texture
\param tx the EVG texture to use as source for the texture data
\warning do NOT use a Texture object constructed from a FilterPacket, this will fail and throw an exception. Use upload(FilterPacket) instead*/
void upload(Texture tx);

};

/*! The FilterPid object is extended as follows*/
interface FilterPid {

/*! creates a new output packet using the underlying texture attachment of the context as a texture source (see GF_FilterFrameInterface).
\warning This will throw an error if called more than once on a given context while the associated packet has not been consumed yet!
\param gl the WebGL context used to create the packet.
\param on_frame_consumed a callback function notified when the associated packet has been consumed
\param use_depth if set, uses the depth framebuffer attachment, if enabled, rather than the color texture. See \ref WebGLContext
\return new packet or null with exception*/
FilterPacket new_packet(WebGLContext gl, function on_frame_consumed, optional boolean use_depth);

};

/*! @} */
