Comments (11)

takuma-hmng8 commented on June 1, 2024

This can be achieved more succinctly with r3f's createPortal.

https://twitter.com/tkm_hmng8/status/1744931239373770902

export const Home = () => {
   const ref = useRef<THREE.ShaderMaterial>(null);
   const { size, viewport, camera } = useThree();
   const dpr = viewport.dpr;
   const [updateNoise, setNoise] = useNoise({ size, dpr });
   const [updateFluid, setFluid] = useFluid({ size, dpr });
   const [updateFxBlending, setFxBlending] = useFxBlending({ size, dpr });
   const [updateColorStrata, setColorStrata] = useColorStrata({ size, dpr });
   const [updateBrightnessPicker] = useBrightnessPicker({
      size,
      dpr,
   });

   setFxBlending({
      mapIntensity: 0.45,
   });

   setNoise({
      scale: 0.01,
      warpOctaves: 1,
      noiseOctaves: 1,
      fbmOctaves: 1,
      timeStrength: 1.2,
      warpStrength: 20.0,
   });

   setFluid({
      density_dissipation: 0.96,
      velocity_dissipation: 0.99,
      curl_strength: 0.0,
      splat_radius: 0.0045,
      pressure_iterations: 1,
   });

   setColorStrata({
      laminateLayer: 4,
      laminateInterval: new THREE.Vector2(1, 1),
      laminateDetail: new THREE.Vector2(0.3, 0.3),
      distortion: new THREE.Vector2(2, 2),
      colorFactor: new THREE.Vector3(6.2, 4.2, 8.8),
      timeStrength: new THREE.Vector2(1, 1),
      noiseStrength: new THREE.Vector2(1, 1),
   });

   // This scene is rendered offscreen
   const offscreenScene = useMemo(() => new THREE.Scene(), []);

   // create FBO for offscreen rendering
   const [_, updateRenderTarget] = useSingleFBO({
      scene: offscreenScene,
      camera,
      size,
      dpr: viewport.dpr,
   });

   useFrame((props) => {
      const noise = updateNoise(props);
      const fluid = updateFluid(props);
      const blending = updateFxBlending(props, {
         texture: fluid,
         map: noise,
      });
      const picked = updateBrightnessPicker(props, {
         texture: blending,
      });
      const colorStrata = updateColorStrata(props, {
         texture: picked,
         noise: noise,
      });
      ref.current!.uniforms.u_fx.value = colorStrata;
      ref.current!.uniforms.u_texture.value = updateRenderTarget(props.gl);
   });

   return (
      <>
         {createPortal(
            <mesh>
               <ambientLight intensity={Math.PI} />
               <spotLight
                  position={[10, 10, 10]}
                  angle={0.15}
                  penumbra={1}
                  decay={0}
                  intensity={Math.PI}
               />
               <pointLight
                  position={[-10, -10, -10]}
                  decay={0}
                  intensity={Math.PI}
               />
               <Box position={[-1.5, 0, 0]} />
               <Box position={[1.5, 0, 0]} />
            </mesh>,
            offscreenScene
         )}
         <mesh>
            <planeGeometry args={[2, 2]} />
            <shaderMaterial
               ref={ref}
               transparent
               vertexShader={`
                  varying vec2 vUv;
                  void main() {
                     vUv = uv;
                     gl_Position = vec4(position, 1.0);
                  }
               `}
               fragmentShader={`
                  precision highp float;
                  varying vec2 vUv;
                  uniform sampler2D u_fx;
                  uniform sampler2D u_texture;

                  void main() {
                     vec2 uv = vUv;
                     vec3 noiseMap = texture2D(u_fx, uv).rgb;
                     vec3 nNoiseMap = noiseMap * 2.0 - 1.0;
                     uv = uv * 2.0 - 1.0;
                     uv *= mix(vec2(1.0), abs(nNoiseMap.rg), 1.);
                     uv = (uv + 1.0) / 2.0;

                     vec3 texColor = texture2D(u_texture, uv).rgb;
                     vec3 color = mix(texColor, noiseMap, 0.5);

                     float luminance = length(color);

                     float edge0 = 0.0;
                     float edge1 = .2;
                     float alpha = smoothstep(edge0, edge1, luminance);

                     gl_FragColor = vec4(color, alpha);
                  }
               `}
               uniforms={{
                  u_texture: { value: null },
                  u_fx: { value: null },
               }}
            />
         </mesh>
      </>
   );
};

nhtoby311 commented on June 1, 2024

Thank you so much! I followed along and managed to render the scene as a texture on the plane. I just had to make sure not to include my scene's postprocessing inside createPortal(). However, rendering my existing scene to a plane did change the overall look of the scene: everything became slightly brighter and a bit more saturated, maybe due to tone mapping(?), and there seems to be no anti-aliasing applied either.
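
If the shift really is coming from tone mapping, one thing I plan to try (just a guess on my side, not from the library docs) is appending the built-in three.js shader chunks at the end of the custom fragment shader, so the output of the plane material goes through the same tone mapping and output color-space conversion as the rest of the scene:

void main() {
   // ...existing fragment shader code from the example above...

   gl_FragColor = vec4(color, alpha);

   // Untested assumption: these chunks are available in recent three.js versions
   // and are injected for (non-raw) ShaderMaterial, so including them should apply
   // the renderer's tone mapping and output color-space conversion to gl_FragColor.
   #include <tonemapping_fragment>
   #include <colorspace_fragment>
}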

takuma-hmng8 commented on June 1, 2024

@nhtoby311

Thank you so much for your detailed report! I found the cause of the problem.
It seems that, depending on the GPU/driver settings (I don't know exactly how to reproduce it), the renderer is auto-cleared by default in some environments and not in others.

codesandbox

gl.autoClear = true;
fxRef.current!.u_texture = updateRenderTarget(gl);

The problem should be solved by setting the renderer's autoClear to true, as shown above.

If you want to clear only the FBO's render pass, you can call clear() in the second argument of the FBO update function, as shown below. The second argument is a callback that is executed just before render() is called.

fxRef.current!.u_texture = updateRenderTarget(gl, () => {
   gl.clear();
});

In the latest v1.0.35, the FBO is cleared by default before rendering.
I think it is very rare to intentionally skip clearing an FBO, and double buffering is more appropriate if you want effects like motion blur. By the way, there is also a useDoubleFBO hook for double buffering; see the sketch below.
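
For reference, here is a rough sketch of how the double-buffered version would look. The props are assumed to mirror useSingleFBO and the update function is assumed to return the texture to sample; please check the docs for the exact return shape.

// Sketch only: prop and return shapes are assumed to mirror useSingleFBO.
const [targets, updateDoubleRenderTarget] = useDoubleFBO({
   scene: offscreenScene,
   camera,
   size,
   dpr: viewport.dpr,
});

useFrame((props) => {
   // The hook is assumed to swap its read/write targets internally each frame,
   // so the previous frame's result can be fed back in (e.g. for motion blur).
   ref.current!.uniforms.u_texture.value = updateDoubleRenderTarget(props.gl);
});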

nhtoby311 commented on June 1, 2024

Thank you so much for the help and explanation; I learned a couple of new things about FBOs from this as well! I updated to v1.0.35 and everything works perfectly now! 🎉

takuma-hmng8 commented on June 1, 2024

Happy new year!
Thanks for your questions!

Is this what you want to do?
I used the basic r3f example as a reference.

Setting shaderMaterial's transparent to true enables material opacity.

However, if you need to apply the FX (Noise in this case) to an element of the scene (the Box in this case), you will need to render the scene to a texture.

function Box(props: any) {
   // This reference will give us direct access to the mesh
   const meshRef = useRef<THREE.Mesh>();
   // Set up state for the hovered and active state
   const [hovered, setHover] = useState(false);
   const [active, setActive] = useState(false);
   // Subscribe this component to the render-loop, rotate the mesh every frame
   useFrame((state, delta) => (meshRef.current!.rotation.x += delta));
   // Return view, these are regular three.js elements expressed in JSX
   return (
      <mesh
         {...props}
         ref={meshRef}
         scale={active ? 1.5 : 1}
         onClick={(event) => setActive(!active)}
         onPointerOver={(event) => setHover(true)}
         onPointerOut={(event) => setHover(false)}>
         <boxGeometry args={[1, 1, 1]} />
         <meshStandardMaterial color={hovered ? "hotpink" : "orange"} />
      </mesh>
   );
}

function PlaneOverlay() {
   const ref = useRef<THREE.ShaderMaterial>(null);
   const { size, dpr } = useThree((state) => {
      return { size: state.size, dpr: state.viewport.dpr };
   });
   const [updateNoise] = useNoise({ size, dpr });

   useFrame((props) => {
      const noise = updateNoise(props);
      ref.current!.uniforms.u_fx.value = noise;
   });

   return (
      <mesh position={[0, 0, 1]}>
         <planeGeometry args={[2, 2]} />
         <shaderMaterial
            ref={ref}
            transparent
            vertexShader={`
               varying vec2 vUv;
               void main() {
                  vUv = uv;
                  gl_Position = vec4(position, 1.0);
               }
            `}
            fragmentShader={`
               precision highp float;
               varying vec2 vUv;
               uniform sampler2D u_fx;

               void main() {
                  vec2 uv = vUv;
                  gl_FragColor = texture2D(u_fx, uv);
                  gl_FragColor.a = 0.5;
               }
            `}
            uniforms={{
               u_fx: { value: null },
            }}
         />
      </mesh>
   );
}

export const Home = () => {
   return (
      <>
         <ambientLight intensity={Math.PI / 2} />
         <spotLight
            position={[10, 10, 10]}
            angle={0.15}
            penumbra={1}
            decay={0}
            intensity={Math.PI}
         />
         <pointLight position={[-10, -10, -10]} decay={0} intensity={Math.PI} />
         <Box position={[-1.2, 0, 0]} />
         <Box position={[1.2, 0, 0]} />
         <PlaneOverlay />
      </>
   );
};

nhtoby311 commented on June 1, 2024

I see, so adding transparent to the material and adjusting the alpha of gl_FragColor will make it transparent. I figure that if I calculate the luminance of the color, I can effectively keep the colored part of the FX visible while hiding the rest.

vec4 color = texture2D(u_fx, uv);

// Calculate the luminance of the color
float luminance = dot(color.rgb, vec3(0.299, 0.587, 0.114));
    
// Set the alpha channel to the luminance
gl_FragColor = vec4(color.rgb, luminance);

However, this only applies the color to an otherwise empty plane; it does not morph/displace a texture like in this example. I wonder how I can use the rendered scene as the texture for u_fx, so this effect could also be applied?

I created this codesandbox for clearer explanation: https://codesandbox.io/p/devbox/mouse-shader-fx-8pjvyd?file=%2Fsrc%2FUseBlending.tsx

takuma-hmng8 commented on June 1, 2024

However, this only applies the color to an otherwise empty plane; it does not morph/displace a texture like in this example. I wonder how I can use the rendered scene as the texture for u_fx, so this effect could also be applied?

I agree! What is drawn with r3f needs to be rendered offscreen and turned into a texture.
This can be achieved by creating a new scene and adding to it the mesh you want to render offscreen.

export const Home = () => {
  const { size, viewport, camera, gl } = useThree();

  // This scene is rendered offscreen
  const offscreenScene = useMemo(() => new THREE.Scene(), []);
  const offscreenMesh = useRef<THREE.Mesh>(null);
  // create FBO for offscreen rendering
  const [_, updateRenderTarget] = useSingleFBO({
    scene: offscreenScene,
    camera,
    size,
    dpr: viewport.dpr,
  });
  useEffect(() => {
    offscreenScene.add(offscreenMesh.current!);
  }, [offscreenScene]);

  // generate noise
  const shaderMaterial = useRef<THREE.ShaderMaterial>(null);
  const [updateNoise] = useNoise({ size, dpr: viewport.dpr });

  useFrame((props) => {
    shaderMaterial.current!.uniforms.u_fx.value = updateNoise(props);
    shaderMaterial.current!.uniforms.u_texture.value = updateRenderTarget(gl);
  });

  return (
    <>
      <mesh ref={offscreenMesh}>
        <ambientLight intensity={Math.PI / 2} />
        <spotLight
          position={[10, 10, 10]}
          angle={0.15}
          penumbra={1}
          decay={0}
          intensity={Math.PI}
        />
        <pointLight position={[-10, -10, -10]} decay={0} intensity={Math.PI} />
        <Box position={[-1.2, 0, 0]} />
        <Box position={[1.2, 0, 0]} />
      </mesh>
      <mesh>
        <planeGeometry args={[2, 2]} />
        <shaderMaterial
          ref={shaderMaterial}
          transparent
          vertexShader={`
            varying vec2 vUv;
            void main() {
              vUv = uv;
              gl_Position = vec4(position, 1.0);
            }
          `}
          fragmentShader={`
            precision highp float;
            varying vec2 vUv;
            uniform sampler2D u_fx;
            uniform sampler2D u_texture;

            void main() {
              vec2 uv = vUv;
              vec3 noiseMap = texture2D(u_fx, uv).rgb;
              vec3 nNoiseMap = noiseMap * 2.0 - 1.0;
              uv = uv * 2.0 - 1.0;
              uv *= mix(vec2(1.0), abs(nNoiseMap.rg), .6);
              uv = (uv + 1.0) / 2.0;
              gl_FragColor = texture2D(u_texture, uv);
            }
          `}
          uniforms={{
            u_texture: { value: null },
            u_fx: { value: null },
          }}
        />
      </mesh>
    </>
  );
};

I created a sandbox, so please take a look!
https://codesandbox.io/p/sandbox/r3f-use-shader-fx-kzzmfy

takuma-hmng8 commented on June 1, 2024

WebGL does not apply anti-aliasing to FBOs that render off-screen. Therefore, the WebGLRenderTarget in three.js has an option called samples. This allows Multi-Sampled Anti-Aliasing (MSAA) to be applied to the render buffer.

Since v1.0.34 of use-shader-fx, the number of samples can be set in the hook props.

const [updateNoise, setNoise] = useNoise({ size, dpr, samples: 4 });

Incidentally, if you want to make fine adjustments to the hook's renderTarget, it is also exposed in the object returned by the hook, as shown below.

const [updateNoise, setNoise, noiseObj] = useNoise({ size, dpr });
useEffect(() => {
   noiseObj.renderTarget.samples = 8;
   noiseObj.renderTarget.depthBuffer = true;
   noiseObj.renderTarget.depthTexture = new THREE.DepthTexture(
      size.width,
      size.height,
      THREE.FloatType
   );
}, []);

nhtoby311 commented on June 1, 2024

Thank you! I have updated the package and used samples on useSingleFBO for anti-aliasing!

However, I notice that the render buffer's depth buffer is somewhat wrong, resulting in some meshes being rendered behind other meshes where they shouldn't be. Here is the codesandbox, and I attached images below.

  • Normal scene, with correct depth buffer (screenshot attached)

  • Depth buffer is not correct when the scene is inside createPortal (screenshot attached)

takuma-hmng8 commented on June 1, 2024

@nhtoby311

I copied the code in its entirety and tried it, and depthBuffer was true on my end as well!
Maybe codesandbox is not updating the library properly?

https://codesandbox.io/p/sandbox/r3f-use-shader-fx-kzzmfy

If you have experienced similar bugs locally or in other environments, please let us know.

const [renderTarget, updateRenderTarget] = useSingleFBO({
   scene: offscreenScene,
   camera,
   size,
   dpr: viewport.dpr,
   samples: 8,
   depthBuffer: true,
});

useEffect(() => {
   console.log(renderTarget.depthBuffer);
}, []);

Logging it this way makes it easy to check the value.

Note that if the value does not become true by any other means, it can also be set directly:

useEffect(() => {
   renderTarget.depthBuffer = true;
   console.log(renderTarget.depthBuffer);
}, []);

nhtoby311 commented on June 1, 2024

Yes, you are right, something was wrong with codesandbox in that case; I reinstalled the package and it works!

However, there appears to be a different issue in my environment. I have a bunch of custom shader materials there, moving around the scene, and I have to set their renderOrder so they don't accidentally get clipped out. But if I bring the whole scene into the method above with createPortal and depthBuffer: true, some of my moving/animated shader materials produce a trail artifact. Without depthBuffer: true, the scene with the FX overlay renders fine, but as before, the wrong depth buffer affects some parts (regardless of the renderOrder value in my environment).

I have managed to port one of my shaders into the codesandbox and replicate the issue; see the screenshots below.

  • Before: the scene outside of createPortal (screenshot attached)

  • After: the scene inside createPortal, with the shader FX (screenshot attached)
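
As a next step, I will try forcing a clear through the FBO update callback mentioned earlier in this thread (using the same naming as the snippet above), in case the trail comes from the target's color/depth not being cleared between frames; this is just a guess on my part:

fxRef.current!.u_texture = updateRenderTarget(gl, () => {
   // gl.clear() with no arguments clears color, depth and stencil
   gl.clear();
});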
