Point-to-line distance in GLSL

I'm trying to work out a precise point-to-line distance calculation in GLSL - in turbo.js.

This is part of a more general problem in which I'm trying to find the nearest point on a GeoJSON MultiLineString for each point in a set of GeoJSON points - with 500 points against 1000 line segments, that works out to 500,000 point-to-line distance calculations.

That's too much to handle in the browser (even in a worker), so parallelism would be a big help.

The catch is that, AFAIK, I can only use a vec4 as input, which means I can only run calculations on pairs of points.

So far I've managed to compute the distance and bearing for every pair - but I can't work out the point-to-line distance.

So the question is - given three points a, b and c, and knowing

  • their positions in lon and lat
  • their pairwise bearings and distances

is it possible to compute the distance from a to the line defined by b and c, using a transform that takes vec2, vec3 or vec4 as input parameters?

As a sub-question - I know how to compute the distance when the altitude of the triangle (a, b, c) doesn't intersect the segment (b, c), because it's then min(distance(a, b), distance(a, c)).

But how do I work out whether it intersects?
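
For reference, the usual planar test (treating lon/lat as plane coordinates, which is an approximation) is to project a onto the segment and check whether the projection parameter lands between 0 and 1 - a minimal GLSL sketch with illustrative names:

  // Sketch only: does the perpendicular dropped from a onto the line (b, c)
  // land inside the segment? t is the position of the foot along b->c.
  bool altitudeHitsSegment(vec2 a, vec2 b, vec2 c) {
    vec2 ba = a - b;
    vec2 bc = c - b;
    float lenSq = dot(bc, bc);      // |c - b|^2
    if (lenSq == 0.0) {
      return false;                 // degenerate segment: b and c coincide
    }
    float t = dot(ba, bc) / lenSq;  // 0.0 at b, 1.0 at c
    return t >= 0.0 && t <= 1.0;
  }

Clamping that parameter to [0, 1] before computing the foot point collapses both cases into a single expression, which is what the answer below does.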

Solution

I'm not quite sure I understand your question.

It sounds like you have 500 input points and 1000 line segments, and for each point you want to know which segment is closest.

If that's what you're asking, then put all your points into floating point textures (another word for a texture is a 2D array). Draw a -1 to +1 quad that is the size of the number of results (500 results, so 50x10 or 25x20, etc.). Pass in the resolution of the textures. Use gl_FragCoord to compute an index to get input A, and loop over all the lines. Encode the index of the closest pair as a color and read the results back via readPixels.

  precision highp float;

  uniform sampler2D aValues;
  uniform vec2 aDimensions;  // the size of the aValues texture in pixels (texels)
  uniform sampler2D bValues;
  uniform vec2 bDimensions;  // the size of the bValues texture in pixels (texels)
  uniform sampler2D cValues;
  uniform vec2 cDimensions;  // the size of the cValues texture in pixels (texels)
  uniform vec2 outputDimensions; // the size of the thing we're drawing to (canvas)

  // this code, given a sampler2D, the size of the texture, and an index
  // computes a UV coordinate to pull one RGBA value out of a texture
  // as though the texture was a 1D array.
  vec3 getPoint(in sampler2D tex, in vec2 dimensions, in float index) {
    vec2 uv = (vec2(
       floor(mod(index, dimensions.x)),
       floor(index / dimensions.x)) + 0.5) / dimensions;
    return texture2D(tex, uv).xyz;
  }

  // from https://stackoverflow.com/a/6853926/128511
  float distanceFromPointToLine(in vec3 a, in vec3 b, in vec3 c) {
    vec3 ba = a - b;
    vec3 bc = c - b;
    float d = dot(ba, bc);
    float len = length(bc);
    float param = 0.0;
    if (len != 0.0) {
      param = clamp(d / (len * len), 0.0, 1.0);
    }
    vec3 r = b + bc * param;
    return distance(a, r);
  }

  void main() {
    // gl_FragCoord is the coordinate of the pixel that is being set by the fragment shader.
    // It is the center of the pixel so the bottom left corner pixel will be (0.5, 0.5).
    // The pixel to the left of that is (1.5, 0.5), the pixel above that is (0.5, 1.5), etc...
    // so we can compute back into a linear index 
    float ndx = floor(gl_FragCoord.y) * outputDimensions.x + floor(gl_FragCoord.x); 
    
    // find the closest points
    float minDist = 10000000.0; 
    float minIndex = -1.0;
    vec3 a = getPoint(aValues, aDimensions, ndx);
    for (int i = 0; i < ${bPoints.length / 4}; ++i) {
      vec3 b = getPoint(bValues, bDimensions, float(i));
      vec3 c = getPoint(cValues, cDimensions, float(i));
      float dist = distanceFromPointToLine(a, b, c);
      if (dist < minDist) {
        minDist = dist;
        minIndex = float(i);
      }
    }
    
    // convert to 8bit color. The canvas defaults to RGBA 8bits per channel
    // so take our integer index (minIndex) and convert to float values that
    // will end up as the same 32bit index when read via readPixels as
    // 32bit values.
    gl_FragColor = vec4(
      mod(minIndex, 256.0),
      mod(floor(minIndex / 256.0), 256.0),
      mod(floor(minIndex / (256.0 * 256.0)), 256.0),
      floor(minIndex / (256.0 * 256.0 * 256.0))) / 255.0;
  }

Just a guess, but in general this would probably be better solved with some kind of spatial structure so that you don't have to check every line against every point; still, the code above should work and it runs in parallel - each result is computed by a different GPU 'core'.
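
As a rough sketch of what such a spatial structure could look like on the CPU side (a uniform grid that buckets segment indices by cell; cellSize and the helper names are made up for illustration and are not part of the snippet below):

// Build a uniform grid: each cell maps to the indices of segments whose
// bounding box touches that cell. segments is [[bx, by, cx, cy], ...].
function buildSegmentGrid(segments, cellSize) {
  const grid = new Map();
  segments.forEach(([bx, by, cx, cy], i) => {
    const minX = Math.floor(Math.min(bx, cx) / cellSize);
    const maxX = Math.floor(Math.max(bx, cx) / cellSize);
    const minY = Math.floor(Math.min(by, cy) / cellSize);
    const maxY = Math.floor(Math.max(by, cy) / cellSize);
    for (let gx = minX; gx <= maxX; ++gx) {
      for (let gy = minY; gy <= maxY; ++gy) {
        const key = gx + ',' + gy;
        if (!grid.has(key)) grid.set(key, []);
        grid.get(key).push(i);
      }
    }
  });
  return grid;
}

// Candidate segments for a point: its own cell plus the 8 neighbours.
// A robust nearest query would widen the search ring until it finds a hit.
function candidateSegments(grid, cellSize, x, y) {
  const cx = Math.floor(x / cellSize);
  const cy = Math.floor(y / cellSize);
  const out = new Set();
  for (let gx = cx - 1; gx <= cx + 1; ++gx) {
    for (let gy = cy - 1; gy <= cy + 1; ++gy) {
      for (const i of (grid.get(gx + ',' + gy) || [])) out.add(i);
    }
  }
  return [...out];
}

With something like that, each of the 500 points only tests the handful of nearby segments instead of all 1000.
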

const v3 = twgl.v3;

// note: I'm using twgl to make the code smaller.
// This is not a lesson in WebGL. You should already know what it means
// to setup buffers and attributes and set uniforms and create textures.
// What's important is the technique, not the minutia of WebGL. If you
// don't know how to do those things you need a much bigger tutorial
// on WebGL like https://webglfundamentals.org

function main() {
  const gl = document.createElement('canvas').getContext('webgl');
  const ext = gl.getExtension('OES_texture_float');
  if (!ext) {
    alert('need OES_texture_float');
    return;
  }
  
  const r = max => Math.random() * max;
  const hsl = (h, s, l) => `hsl(${h * 360}, ${s * 100 | 0}%, ${l * 100 | 0}%)`;
  function createPoints(numPoints) {
    const points = [];
    for (let i = 0; i < numPoints; ++i) {
      points.push(r(300), r(150), 0, 0);  // RGBA
    }
    return points;
  }

  function distanceFromPointToLineSquared(a, b, c) {
    const ba = v3.subtract(a, b);
    const bc = v3.subtract(c, b);
    const dot = v3.dot(ba, bc);
    const lenSq = v3.lengthSq(bc);
    let param = 0;
    if (lenSq !== 0) {
      param = Math.min(1, Math.max(0, dot / lenSq));
    }
    const r = v3.add(b, v3.mulScalar(bc, param));
    return v3.distanceSq(a, r);
  }

  const aPoints = createPoints(6);
  const bPoints = createPoints(15);
  const cPoints = createPoints(15);
  
  // do it in JS to check
  {
    // compute closest lines to points
    const closest = [];
    for (let i = 0; i < aPoints.length; i += 4) {
      const a = aPoints.slice(i, i + 3);
      let minDistSq = Number.MAX_VALUE;
      let minIndex = -1;
      for (let j = 0; j < bPoints.length; j += 4) {
        const b = bPoints.slice(j, j + 3);
        const c = cPoints.slice(j, j + 3);
        const distSq = distanceFromPointToLineSquared(a, b, c);
        if (distSq < minDistSq) {
          minDistSq = distSq;
          minIndex = j / 4;
        }
      }
      closest.push(minIndex);
    }

    drawResults(document.querySelector('#js'), closest);
  }

  const vs = `
  attribute vec4 position;
  void main() {
    gl_Position = position;
  }
  `;
  
  const fs = `
  precision highp float;

  uniform sampler2D aValues;
  uniform vec2 aDimensions;  // the size of the aValues texture in pixels (texels)
  uniform sampler2D bValues;
  uniform vec2 bDimensions;  // the size of the bValues texture in pixels (texels)
  uniform sampler2D cValues;
  uniform vec2 cDimensions;  // the size of the cValues texture in pixels (texels)
  uniform vec2 outputDimensions; // the size of the thing we're drawing to (canvas)

  // this code, given a sampler2D, the size of the texture, and an index
  // computes a UV coordinate to pull one RGBA value out of a texture
  // as though the texture was a 1D array.
  vec3 getPoint(in sampler2D tex, in vec2 dimensions, in float index) {
    vec2 uv = (vec2(
       floor(mod(index, dimensions.x)),
       floor(index / dimensions.x)) + 0.5) / dimensions;
    return texture2D(tex, uv).xyz;
  }

  // from https://stackoverflow.com/a/6853926/128511
  float distanceFromPointToLine(in vec3 a, in vec3 b, in vec3 c) {
    vec3 ba = a - b;
    vec3 bc = c - b;
    float d = dot(ba, bc);
    float len = length(bc);
    float param = 0.0;
    if (len != 0.0) {
      param = clamp(d / (len * len), 0.0, 1.0);
    }
    vec3 r = b + bc * param;
    return distance(a, r);
  }

  void main() {
    // gl_FragCoord is the coordinate of the pixel that is being set by the fragment shader.
    // It is the center of the pixel so the bottom left corner pixel will be (0.5, 0.5).
    // The pixel to the left of that is (1.5, 0.5), the pixel above that is (0.5, 1.5), etc...
    // so we can compute back into a linear index
    float ndx = floor(gl_FragCoord.y) * outputDimensions.x + floor(gl_FragCoord.x);

    // find the closest points
    float minDist = 10000000.0;
    float minIndex = -1.0;
    vec3 a = getPoint(aValues, aDimensions, ndx);
    for (int i = 0; i < ${bPoints.length / 4}; ++i) {
      vec3 b = getPoint(bValues, bDimensions, float(i));
      vec3 c = getPoint(cValues, cDimensions, float(i));
      float dist = distanceFromPointToLine(a, b, c);
      if (dist < minDist) {
        minDist = dist;
        minIndex = float(i);
      }
    }

    // convert to 8bit color. The canvas defaults to RGBA 8bits per channel
    // so take our integer index (minIndex) and convert to float values that
    // will end up as the same 32bit index when read via readPixels as
    // 32bit values.
    gl_FragColor = vec4(
      mod(minIndex, 256.0),
      mod(floor(minIndex / 256.0), 256.0),
      mod(floor(minIndex / (256.0 * 256.0)), 256.0),
      floor(minIndex / (256.0 * 256.0 * 256.0))) / 255.0;
  }
  `;
  
  // compile shaders, link program, look up locations
  const programInfo = twgl.createProgramInfo(gl, [vs, fs]);

  // calls gl.createBuffer, gl.bindBuffer, gl.bufferData for a -1 to +1 quad
  const bufferInfo = twgl.primitives.createXYQuadBufferInfo(gl);

  // make an RGBA float texture for each set of points
  // calls gl.createTexture, gl.bindTexture, gl.texImage2D, gl.texParameteri
  const aTex = twgl.createTexture(gl, {
    src: aPoints,
    width: aPoints.length / 4,
    type: gl.FLOAT,
    minMag: gl.NEAREST,
  });
  const bTex = twgl.createTexture(gl, {
    src: bPoints,
    width: bPoints.length / 4,
    type: gl.FLOAT,
    minMag: gl.NEAREST,
  });
  const cTex = twgl.createTexture(gl, {
    src: cPoints,
    width: cPoints.length / 4,
    type: gl.FLOAT,
    minMag: gl.NEAREST,
  });
    
  const numOutputs = aPoints.length / 4;
  gl.canvas.width = numOutputs;
  gl.canvas.height = 1;
  gl.viewport(0, 0, numOutputs, 1);
  
  gl.useProgram(programInfo.program);

  // calls gl.bindBuffer, gl.enableVertexAttribArray, gl.vertexAttribPointer
  twgl.setBuffersAndAttributes(gl, programInfo, bufferInfo);

  // calls gl.activeTexture, gl.bindTexture, gl.uniform
  twgl.setUniforms(programInfo, {
    aValues: aTex,
    aDimensions: [aPoints.length / 4, 1],
    bValues: bTex,
    bDimensions: [bPoints.length / 4, 1],
    cValues: cTex,
    cDimensions: [cPoints.length / 4, 1],
    outputDimensions: [aPoints.length / 4, 1],
  });

  // draw the quad
  gl.drawElements(gl.TRIANGLES, 6, gl.UNSIGNED_SHORT, 0);

  // get result
  const pixels = new Uint8Array(numOutputs * 4);
  const results = new Uint32Array(pixels.buffer);
  gl.readPixels(0, 0, numOutputs, 1, gl.RGBA, gl.UNSIGNED_BYTE, pixels);
  drawResults(document.querySelector('#glsl'), results);


  function drawResults(canvas, closest) {
    const ctx = canvas.getContext('2d');

    // draw the lines
    ctx.beginPath();
    for (let j = 0; j < bPoints.length; j += 4) {
      const b = bPoints.slice(j, j + 2);
      const c = cPoints.slice(j, j + 2);
      ctx.moveTo(...b);
      ctx.lineTo(...c);
    }
    ctx.strokeStyle = '#888';
    ctx.stroke();

    // draw the points and closest lines
    for (let i = 0; i < aPoints.length; i += 4) {
      const a = aPoints.slice(i, i + 2);
      const ndx = closest[i / 4] * 4;
      const b = bPoints.slice(ndx, ndx + 2);
      const c = cPoints.slice(ndx, ndx + 2);
      const color = hsl(i / aPoints.length, 1, 0.4);
      ctx.fillStyle = color;
      ctx.strokeStyle = color;
      ctx.fillRect(a[0] - 2, a[1] - 2, 5, 5);
      ctx.beginPath();
      ctx.moveTo(...b);
      ctx.lineTo(...c);
      ctx.stroke();
    }
  }

}
main();
canvas { border: 1px solid black; margin: 5px; }
<script src="https://twgljs.org/dist/4.x/twgl-full.min.js"></script>
<div>glsl</div>
<canvas id="glsl"></canvas>
<div>js</div>
<canvas id="js"></canvas>

If you're using WebGL2 you can use texelFetch, so getPoint becomes

vec3 getPoint(in sampler2D tex, in int index) {
  ivec2 size = textureSize(tex, 0);
  ivec2 uv = ivec2(index % size.x, index / size.x);
  return texelFetch(tex, uv, 0).xyz;
}

and you don't need to pass in the size of the input textures, only the output size. Also, you could make the output an R32UI texture and write out unsigned integer indices, so there would be no need to encode the results.
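
As a sketch of what the R32UI route could look like on the shader side (GLSL ES 3.00; this is an assumption about the wiring, not code from the snippet above):

#version 300 es
precision highp float;
precision highp int;

// With an R32UI color attachment the shader writes the index directly as an
// unsigned integer instead of packing it into 8-bit RGBA channels.
out uint closestIndex;

void main() {
  uint minIndex = 0u;
  // ... same search loop as before, tracking minIndex as a uint ...
  closestIndex = minIndex;
}

On the JavaScript side you would attach an R32UI texture to a framebuffer and read it back (or sample it from a later pass); for integer color buffers WebGL2 guarantees readPixels with gl.RGBA_INTEGER / gl.UNSIGNED_INT, so no decoding step is needed.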

Note: the code assumes you have fewer than 2048 values each for a, b and c, so much of it assumes one-dimensional textures. If you need more than 2048 you'll have to adjust the code to use rectangular textures of a size that fits your data - for example, for 9000 values a 9x1000 texture would work. With 8999 values you'd still need to pad out a 9x1000 texture, since textures are 2D arrays and have to be rectangular.
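
A minimal sketch of a sizing/padding helper along those lines (plain JavaScript; the default width of 1024 is just an illustrative choice):

// Pad a flat RGBA array so it fills a width x height rectangle of texels.
// Returns the padded data plus the dimensions to pass as the *Dimensions uniform.
function padToTexture(values, width = 1024) {
  const numTexels = values.length / 4;
  const height = Math.max(1, Math.ceil(numTexels / width));
  const padded = new Float32Array(width * height * 4);
  padded.set(values);  // unused texels stay at 0
  return { data: padded, width, height };
}

// e.g. const b = padToTexture(bPoints);  // b.data for src, [b.width, b.height] for bDimensions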

Also note that calling readPixels is considered slow. For example, if you just wanted to draw the results as above, then instead of rendering to the canvas and reading the values out with readPixels, you could render the results to a texture and then pass that texture into another shader.
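
If you took that route with twgl, the setup could look roughly like this (a sketch; the attachment options shown are assumptions rather than code from the snippet above):

// Render into a texture instead of the canvas so a later shader can read the
// result without a readPixels round trip.
const attachments = [
  { format: gl.RGBA, type: gl.UNSIGNED_BYTE, minMag: gl.NEAREST },
];
const fbi = twgl.createFramebufferInfo(gl, attachments, numOutputs, 1);

twgl.bindFramebufferInfo(gl, fbi);    // binds the framebuffer and sets the viewport
gl.drawElements(gl.TRIANGLES, 6, gl.UNSIGNED_SHORT, 0);
twgl.bindFramebufferInfo(gl, null);   // back to the canvas

// fbi.attachments[0] is now a texture holding the encoded indices;
// pass it as a sampler2D uniform to the next shader.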


Addendum

This is probably the wrong place for this, but as a terse explanation of GLSL for this kind of use: you can think of GLSL as a fancy version of Array.prototype.map. When you use map you don't choose what is written to directly - that happens indirectly.

const a = [1, 2, 3, 4, 5];
const b = a.map((v, index) => { return v * 2 + index; });

The { return v * 2 + index } part is analogous to a shader. In JavaScript, the function inside map returns a value; in GLSL ES 1.0, the shader sets gl_FragColor as its output. In JavaScript, index is the index of the array being written to (which also happens to be the index of the input array); in GLSL, gl_FragCoord plays the same role.

Otherwise, the output of the vertex shader determines which pixels (which array elements of the 2D array) will be written to, which makes this a more selective version of map. In the code above we drew a -1 to +1 quad, which effectively says "map over all the pixels".

In fact, here's a version of the code above with no GLSL - just JavaScript, restructured so it looks more like the GLSL.

const v3 = twgl.v3;

function main() {

  const r = max => Math.random() * max;
  const hsl = (h, s, l) => `hsl(${h * 360}, ${s * 100 | 0}%, ${l * 100 | 0}%)`;

  function createPoints(numPoints) {
    const points = [];
    for (let i = 0; i < numPoints; ++i) {
      points.push(r(300), r(150), 0, 0);  // RGBA
    }
    return points;
  }

  function distanceFromPointToLineSquared(a, b, c) {
    const ba = v3.subtract(a, b);
    const bc = v3.subtract(c, b);
    const dot = v3.dot(ba, bc);
    const lenSq = v3.lengthSq(bc);
    let param = 0;
    if (lenSq !== 0) {
      param = Math.min(1, Math.max(0, dot / lenSq));
    }
    const r = v3.add(b, v3.mulScalar(bc, param));
    return v3.distanceSq(a, r);
  }

  const aPoints = createPoints(6);
  const bPoints = createPoints(15);
  const cPoints = createPoints(15);
  
  const gl_FragCoord = {};
  let gl_FragColor;
  
  const aValues = aPoints;
  const aDimensions = {}; // N/A
  const bValues = bPoints;
  const bDimensions = {}; // N/A
  const cValues = cPoints;
  const cDimensions = {}; // N/A
  const outputDimensions = { x: aPoints.length / 4, y: 1 };

  function getPoint(sampler, dimension, ndx) {
    return sampler.slice(ndx * 4, ndx * 4 + 3);
  }
  
  function javaScriptFragmentShader() {
    // gl_FragCoord is the coordinate of the pixel that is being set by the fragment shader.
    // It is the center of the pixel so the bottom left corner pixel will be (0.5, 0.5).
    // The pixel to the left of that is (1.5, 0.5), the pixel above that is (0.5, 1.5), etc...
    // so we can compute back into a linear index 
    const ndx = Math.floor(gl_FragCoord.y) * outputDimensions.x + Math.floor(gl_FragCoord.x); 
    
    // find the closest points
    let minDist = 10000000.0; 
    let minIndex = -1.0;
    const a = getPoint(aValues, aDimensions, ndx);
    for (let i = 0; i < bPoints.length / 4; ++i) {
      const b = getPoint(bValues, bDimensions, i);
      const c = getPoint(cValues, cDimensions, i);
      const dist = distanceFromPointToLineSquared(a, b, c);
      if (dist < minDist) {
        minDist = dist;
        minIndex = i;
      }
    }
    
    // convert to 8bit color. The canvas defaults to RGBA 8bits per channel
    // so take our integer index (minIndex) and convert to float values that
    // will end up as the same 32bit index when read via readPixels as
    // 32bit values.
    gl_FragColor = [
      minIndex % 256.0,
      Math.floor(minIndex / 256.0) % 256.0,
      Math.floor(minIndex / (256.0 * 256.0)) % 256.0,
      Math.floor(minIndex / (256.0 * 256.0 * 256.0)),
    ].map(v => v / 255.0);
  }
  
  // do it in JS to check
  {
    // compute closest lines to points
    
    const closest = [];
    const width = aPoints.length / 4;
    const height = 1;
    
    // WebGL drawing each pixel
    for (let y = 0; y < height; ++y) {
      for (let x = 0; x < width; ++x) {
        gl_FragCoord.x = x + 0.5;  // because pixels represent a rectangle one unit wide in pixel space
        gl_FragCoord.y = y + 0.5;  // so the center of each pixel in the middle of that rectangle
        javaScriptFragmentShader();
        const index = gl_FragColor[0] * 255 +
                      gl_FragColor[1] * 255 * 256 +
                      gl_FragColor[2] * 255 * 256 * 256 +
                      gl_FragColor[3] * 255 * 256 * 256 * 256;
        closest.push(index);
      }
    }

    drawResults(document.querySelector('#js'), closest);
  }

  function drawResults(canvas, closest) {
    const ctx = canvas.getContext('2d');

    // draw the lines
    ctx.beginPath();
    for (let j = 0; j < bPoints.length; j += 4) {
      const b = bPoints.slice(j, j + 2);
      const c = cPoints.slice(j, j + 2);
      ctx.moveTo(...b);
      ctx.lineTo(...c);
    }
    ctx.strokeStyle = '#888';
    ctx.stroke();

    // draw the points and closest lines
    for (let i = 0; i < aPoints.length; i += 4) {
      const a = aPoints.slice(i, i + 2);
      const ndx = closest[i / 4] * 4;
      const b = bPoints.slice(ndx, ndx + 2);
      const c = cPoints.slice(ndx, ndx + 2);
      const color = hsl(i / aPoints.length, 1, 0.4);
      ctx.fillStyle = color;
      ctx.strokeStyle = color;
      ctx.fillRect(a[0] - 2, a[1] - 2, 5, 5);
      ctx.beginPath();
      ctx.moveTo(...b);
      ctx.lineTo(...c);
      ctx.stroke();
    }
  }

}
main();
canvas { border: 1px solid black; margin: 5px; }
<script src="https://twgljs.org/dist/4.x/twgl-full.min.js"></script>
<canvas id="js"></canvas>
