I'm getting tracks marked as muted for all stream types (webcam, microphone, screen share) while consuming them

I'm a beginner who likes to explore and build things. I know the code snippets are long, sorry for that; I think the full code gives a clearer picture. I tried building a mediasoup-based project similar to Google Meet, and I've been stuck on this issue for a month without a solution. I also restructured my code as shown below. The code was different before, and at that time I got the same issue along with an AbortError: "[intercept-console-error.ts:46 Playback error: AbortError: The play() request was interrupted by a call to pause()]" whenever a stream arrived and tried to autoplay, which I think was due to autoplay restrictions. I added a manual play button for that, but it still showed the same error. So I changed the entire code and tried the new version below; now I get a black screen and the AbortError no longer appears.
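(Side note on the earlier AbortError: it usually means play() was still pending when the element got paused or its srcObject was swapped. This is only an illustrative sketch of how playback could be guarded, not my actual component code; the names are placeholders:)

```ts
// Illustrative sketch only: attach a stream and await the play() promise so an
// interrupted play() doesn't surface as an unhandled AbortError.
const attachAndPlay = async (videoEl: HTMLVideoElement, stream: MediaStream) => {
  if (videoEl.srcObject !== stream) {
    videoEl.srcObject = stream;
  }
  try {
    await videoEl.play();
  } catch (err) {
    if ((err as DOMException)?.name === "AbortError") {
      // Expected when play() is interrupted by pause() or a srcObject change.
      console.warn("play() was interrupted:", err);
    } else {
      console.error("Playback error:", err);
    }
  }
};
```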

This is the worker creation code:

const totalThreads = os.cpus().length;

export const createWorkers = (): Promise<mediasoupTypes.Worker[]> =>
  new Promise(async (resolve, reject) => {
    let workers: mediasoupTypes.Worker[] = [];

    try {
      for (let i = 0; i < totalThreads; i++) {
        const worker: mediasoupTypes.Worker = await mediasoup.createWorker({
          rtcMinPort: config.workerSettings.rtcMinPort,
          rtcMaxPort: config.workerSettings.rtcMaxPort,
          logLevel: config.workerSettings.logLevel,
          logTags: config.workerSettings.logTags,
        });

        worker.on("died", () => {
          console.log("Worker has died");
          process.exit(1);
        });

        workers.push(worker);
      }

      resolve(workers);
    } catch (error) {
      reject(error);
    }
  });

      // Mediasoup socket event handlers on the backend
      socket.on("getRtpCapabilities", (data, callback) => {
        try {
          const { streamId } = data;
          const router = this.routers[streamId];
          if (router) {
            callback({ rtpCapabilities: router.rtpCapabilities });
          } else {
            callback({ error: "Router not found" });
          }
        } catch (error) {
          console.error("Error getting RTP capabilities:", error);
          callback({ error: "Failed to get RTP capabilities" });
        }
      });

      socket.on("createWebRtcTransport", async (data, callback) => {
        try {
          const { streamId, direction } = data; // 'send' or 'recv'

          if (!streamId) {
            callback({ error: "Stream ID is required" });
            return;
          }

          const router = this.routers[streamId];
          if (!router) {
            callback({ error: "Router not found" });
            return;
          }

          if (
            !this.participants[streamId] ||
            !this.participants[streamId][socket.id]
          ) {
            callback({ error: "Participant not found" });
            return;
          }

          const participant = this.participants[streamId][socket.id];

          const transport = await router.createWebRtcTransport({
            listenIps: [{ ip: "0.0.0.0" }],
            enableUdp: true,
            enableTcp: true,
            preferUdp: true,
          });

          if (direction === "send") {
            participant.sendTransport = transport;
          } else {
            participant.receiveTransport = transport;
          }

          callback({
            id: transport.id,
            iceParameters: transport.iceParameters,
            iceCandidates: transport.iceCandidates,
            dtlsParameters: transport.dtlsParameters,
          });
        } catch (error) {
          console.error("Error creating WebRtcTransport:", error);
          callback({ error: "Failed to create transport" });
        }
      });

      socket.on("connectTransport", async (data, callback) => {
        try {
          const { streamId, transportId, dtlsParameters } = data;

          if (
            !streamId ||
            !this.participants[streamId] ||
            !this.participants[streamId][socket.id]
          ) {
            callback({ error: "Participant not found" });
            return;
          }

          const participant = this.participants[streamId][socket.id];
          const transport =
            participant.sendTransport?.id === transportId
              ? participant.sendTransport
              : participant.receiveTransport;

          if (!transport) {
            callback({ error: "Transport not found" });
            return;
          }

          await transport.connect({ dtlsParameters });
          callback({ success: true });
        } catch (error) {
          console.error("Error connecting transport:", error);
          callback({ error: "Failed to connect transport" });
        }
      });

      socket.on("produce", async (data, callback) => {
        try {
          const { streamId, kind, rtpParameters, appData } = data;

          if (
            !streamId ||
            !this.participants[streamId] ||
            !this.participants[streamId][socket.id]
          ) {
            callback({ error: "Participant not found" });
            return;
          }

          const participant = this.participants[streamId][socket.id];
          if (!participant.sendTransport) {
            callback({ error: "Send transport not found" });
            return;
          }

          const producer = await participant.sendTransport.produce({
            kind,
            rtpParameters,
            appData, // e.g., { source: "webcam" | "mic" | "screen" }
          });

          participant.producers.push(producer);

          // Add to stream producers list for new participants
          if (!this.streamProducers[streamId]) {
            this.streamProducers[streamId] = [];
          }

          this.streamProducers[streamId].push({
            id: producer.id,
            kind,
            userId: participant.userId,
            appData,
          });

          // Notify all participants about the new producer
          this.io.to(streamId).emit("newProducer", {
            producerId: producer.id,
            userId: participant.userId,
            kind,
            appData,
          });

          callback({ id: producer.id });

          // Listen for producer close events
          producer.on("transportclose", () => {
            console.log(`Producer ${producer.id} transport closed`);
            this.removeProducerFromStream(streamId, producer.id);
          });

          producer.on("@close", () => {
            console.log(`Producer ${producer.id} closed`);
            this.removeProducerFromStream(streamId, producer.id);
          });
        } catch (error) {
          console.error("Error producing:", error);
          callback({ error: "Failed to produce" });
        }
      });

      socket.on("pauseProducer", async (data, callback) => {
        try {
          const { producerId } = data;
          const streamId = this.socketToStream[socket.id];

          if (
            !streamId ||
            !this.participants[streamId] ||
            !this.participants[streamId][socket.id]
          ) {
            callback({ error: "Participant not found" });
            return;
          }

          const participant = this.participants[streamId][socket.id];
          const producer = participant.producers.find(
            (p) => p.id === producerId
          );

          if (!producer) {
            callback({ error: "Producer not found" });
            return;
          }

          await producer.pause();

          // Notify other participants with appData
          this.io.to(streamId).emit("producerPaused", {
            producerId,
            userId: participant.userId,
            appData: producer.appData,
          });

          callback({ success: true });
        } catch (error) {
          console.error("Error pausing producer:", error);
          callback({ error: "Failed to pause producer" });
        }
      });
      // Resume producer
      socket.on("resumeProducer", async (data, callback) => {
        try {
          const { producerId } = data;
          const streamId = this.socketToStream[socket.id];

          if (
            !streamId ||
            !this.participants[streamId] ||
            !this.participants[streamId][socket.id]
          ) {
            callback({ error: "Participant not found" });
            return;
          }

          const participant = this.participants[streamId][socket.id];
          const producer = participant.producers.find(
            (p) => p.id === producerId
          );

          if (!producer) {
            callback({ error: "Producer not found" });
            return;
          }

          await producer.resume();

          // Notify other participants with appData
          this.io.to(streamId).emit("producerResumed", {
            producerId,
            userId: participant.userId,
            appData: producer.appData,
          });

          callback({ success: true });
        } catch (error) {
          console.error("Error resuming producer:", error);
          callback({ error: "Failed to resume producer" });
        }
      });

      socket.on("consume", async (data, callback) => {
        try {
          const { producerId, rtpCapabilities } = data;
          const streamId = this.socketToStream[socket.id];
          if (!streamId) {
            callback({ error: "Stream ID is required" });
            return;
          }

          console.log(this.participants, "participants in consume");

          const router = this.routers[streamId];
          if (!router) {
            callback({ error: "Router not found" });
            return;
          }

          console.log(
            streamId,
            "streamId in consume",
            socket.id,
            "socketId in consume"
          );

          if (
            !this.participants[streamId] ||
            !this.participants[streamId][socket.id]
          ) {
            console.log(
              "participants in consume if condition",
              this.participants
            );
            console.log("socketId in consume if condition", socket.id);

            callback({ error: "Participant not found" });
            return;
          }

          const participant = this.participants[streamId][socket.id];
          if (!participant.receiveTransport) {
            callback({ error: "Receive transport not found" });
            return;
          }

          console.log("finding producer object");

          // Find the producer by ID
          let producerObject: mediasoupTypes.Producer | undefined;
          for (const p of Object.values(this.participants[streamId])) {
            producerObject = p.producers.find((prod) => prod.id === producerId);
            if (producerObject) break;
          }

          if (!producerObject) {
            callback({ error: "Producer not found" });
            return;
          }

          // Check if router can consume
          if (!router.canConsume({ producerId, rtpCapabilities })) {
            callback({
              error: "Router cannot consume with given RTP capabilities",
            });
            return;
          }

          const consumer = await participant.receiveTransport.consume({
            producerId,
            rtpCapabilities,
            paused: true, // Start paused, will resume after getting response
          });

          participant.consumers.push(consumer);

          console.log(participant.consumers, "consumers in consume");

          // Handle consumer events
          consumer.on("transportclose", () => {
            console.log(`Consumer ${consumer.id} transport closed`);
          });

          consumer.on("producerclose", () => {
            console.log(`Consumer ${consumer.id} producer closed`);
            const index = participant.consumers.findIndex(
              (c) => c.id === consumer.id
            );
            if (index !== -1) {
              participant.consumers.splice(index, 1);
            }
            // Notify the client that producer was closed
            socket.emit("consumerClosed", {
              consumerId: consumer.id,
              reason: "producer closed",
            });
          });
          console.log(
            "sending to the frontend consumer data",
            consumer.id,
            producerId,
            consumer.kind,
            consumer.rtpParameters,
            streamId
          );
          callback({
            id: consumer.id,
            producerId,
            kind: consumer.kind,
            rtpParameters: consumer.rtpParameters,
          });
        } catch (error) {
          console.error("Error consuming:", error);
          callback({ error: "Failed to consume" });
        }
      });

      socket.on("resumeConsumer", async (data, callback) => {
        try {
          const { consumerId } = data;

          const streamId = this.socketToStream[socket.id];

          if (!streamId) {
            callback({ error: "You are not connected to any stream" });
            return;
          }

          if (
            !this.participants[streamId] ||
            !this.participants[streamId][socket.id]
          ) {
            callback({ error: "Participant not found" });
            return;
          }

          const participant = this.participants[streamId][socket.id];
          const consumer = participant.consumers.find(
            (c) => c.id === consumerId
          );

          if (!consumer) {
            callback({ error: "Consumer not found" });
            return;
          }

          await consumer.resume();
          callback({ success: true });
        } catch (error) {
          console.error("Error resuming consumer:", error);
          callback({ error: "Failed to resume consumer" });
        }
      });

So the issue I'm facing is that the consumed track arrives muted, and because of that every client shows the stream as a black screen. I noticed that the consumed track has muted: true for all three stream types. I'm testing on the same system with two tabs (one incognito, one normal). The local participant's stream is also displayed by consuming it; I know that isn't ideal, but I set it up that way for testing so I don't need another participant to consume. I've been stuck on this for a month, so if anybody knows what's going on, please share your thoughts. I've shared my full code snippet. I'm using Next.js and Node.js for this setup, and I attached an image of the console for your reference.
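For debugging, this is roughly how I'm inspecting the consumed track (a sketch using standard MediaStreamTrack properties plus the mediasoup-client consumer; the function name is just a placeholder):

```ts
// Debugging sketch: a remote MediaStreamTrack stays muted === true until RTP
// actually arrives over the receive transport, so I log its state and listen
// for the "unmute" event after calling receiveTransport.consume().
import type { types as mediasoupTypes } from "mediasoup-client";

const inspectConsumerTrack = (consumer: mediasoupTypes.Consumer) => {
  const track = consumer.track;
  console.log("consumer", consumer.id, {
    kind: track.kind,
    readyState: track.readyState, // should be "live"
    trackMuted: track.muted,      // true until media is received
    consumerPaused: consumer.paused,
  });

  track.addEventListener("unmute", () => {
    console.log(`Consumer ${consumer.id}: track started receiving media`);
  });
  track.addEventListener("mute", () => {
    console.log(`Consumer ${consumer.id}: track stopped receiving media`);
  });
};
```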

I've posted the frontend code below for reference.

**This is the frontend side**


  // Initialize MediaSoup device
  const initializeDevice = async () => {
    try {
      if (!streamingSocket || !streamId) return;

      // Create a new MediaSoup device
      const device = new mediasoupClient.Device();

      // Get router RTP capabilities
      const { rtpCapabilities } = await new Promise<any>((resolve, reject) => {
        streamingSocket.emit(
          "getRtpCapabilities",
          { streamId },
          (response: any) => {
            if (response.error) {
              reject(response.error);
            } else {
              resolve(response);
            }
          }
        );
      });

      // Load the device with router RTP capabilities
      await device.load({ routerRtpCapabilities: rtpCapabilities });
      mediaState.current.device = device;
      console.log("MediaSoup device initialized", mediaState.current.device);

      // Create send and receive transports
      await createSendTransport();
      await createReceiveTransport();

      // Start camera and mic if initial states are true
      if (mediaState.current.cameraOn) {
        await startCamera();
      }
      if (mediaState.current.micOn) {
        await startMicrophone();
      }
    } catch (error) {
      console.error("Failed to initialize MediaSoup device:", error);
      toast.error("Failed to initialize streaming device");
    }
  };

  // Create send transport for sending media
  const createSendTransport = async () => {
    try {
      if (!mediaState.current.device || !streamingSocket || !streamId) return;

      const transportParams = await new Promise<any>((resolve, reject) => {
        streamingSocket.emit(
          "createWebRtcTransport",
          {
            streamId,
            direction: "send",
          },
          (response: any) => {
            if (response.error) {
              reject(response.error);
            } else {
              resolve(response);
            }
          }
        );
      });

      const sendTransport = mediaState.current.device.createSendTransport({
        id: transportParams.id,
        iceParameters: transportParams.iceParameters,
        iceCandidates: transportParams.iceCandidates,
        dtlsParameters: transportParams.dtlsParameters,
        sctpParameters: transportParams.sctpParameters,
      });

      // Handle transport events
      sendTransport.on(
        "connect",
        async (
          {
            dtlsParameters,
          }: { dtlsParameters: mediasoupClient.types.DtlsParameters },
          callback: () => void,
          errback: (error: any) => void
        ) => {
          try {
            await new Promise<void>((resolve, reject) => {
              streamingSocket?.emit(
                "connectTransport",
                {
                  transportId: sendTransport.id,
                  dtlsParameters,
                  streamId,
                },
                (response: any) => {
                  if (response.error) {
                    reject(response.error);
                  } else {
                    resolve();
                  }
                }
              );
            });
            callback();
          } catch (error) {
            errback(error as Error);
          }
        }
      );

      sendTransport.on(
        "produce",
        async (
          {
            kind,
            rtpParameters,
            appData,
          }: {
            kind: mediasoupClient.types.MediaKind;
            rtpParameters: mediasoupClient.types.RtpParameters;
            appData: any;
          },
          callback: (options: { id: string }) => void,
          errback: (error: Error) => void
        ) => {
          try {
            const { id } = await new Promise<any>((resolve, reject) => {
              streamingSocket?.emit(
                "produce",
                {
                  streamId,
                  transportId: sendTransport.id,
                  kind,
                  rtpParameters,
                  appData,
                },
                (response: any) => {
                  if (response.error) {
                    reject(response.error);
                  } else {
                    resolve(response);
                  }
                }
              );
            });
            callback({ id });
          } catch (error) {
            errback(error as Error);
          }
        }
      );

      mediaState.current.sendTransport = sendTransport;
      console.log("Send transport created:", sendTransport.id);
    } catch (error) {
      console.error("Failed to create send transport:", error);
      toast.error("Failed to establish streaming connection");
    }
  };

  // Create receive transport for receiving media
  const createReceiveTransport = async () => {
    try {
      if (!mediaState.current.device || !streamingSocket || !streamId) return;

      const transportParams = await new Promise<any>((resolve, reject) => {
        streamingSocket.emit(
          "createWebRtcTransport",
          {
            streamId,
            direction: "recv",
          },
          (response: any) => {
            if (response.error) {
              reject(response.error);
            } else {
              resolve(response);
            }
          }
        );
      });

      const receiveTransport = mediaState.current.device.createRecvTransport({
        id: transportParams.id,
        iceParameters: transportParams.iceParameters,
        iceCandidates: transportParams.iceCandidates,
        dtlsParameters: transportParams.dtlsParameters,
        sctpParameters: transportParams.sctpParameters,
      });

      // Handle transport connect event
      receiveTransport.on(
        "connect",
        async (
          {
            dtlsParameters,
          }: { dtlsParameters: mediasoupClient.types.DtlsParameters },
          callback: () => void,
          errback: (error: Error) => void
        ) => {
          try {
            await new Promise<void>((resolve, reject) => {
              streamingSocket?.emit(
                "connectTransport",
                {
                  transportId: receiveTransport.id,
                  dtlsParameters,
                  streamId,
                },
                (response: any) => {
                  if (response.error) {
                    reject(response.error);
                  } else {
                    resolve();
                  }
                }
              );
            });
            callback();
          } catch (error) {
            errback(error as Error);
          }
        }
      );

      mediaState.current.receiveTransport = receiveTransport;
      console.log("Receive transport created:", receiveTransport.id);
    } catch (error) {
      console.error("Failed to create receive transport:", error);
      toast.error("Failed to establish streaming connection");
    }
  };

  // Start camera
  const startCamera = async () => {
    try {
      console.log(
        mediaState.current.device?.canProduce("video"),
        "camera produce check",
        mediaState.current.sendTransport
      );
      if (
        !mediaState.current.device?.canProduce("video") ||
        !mediaState.current.sendTransport
      ) {
        toast.error("Cannot produce video - device not ready");
        return;
      }

      const existingProducer = mediaState.current.producers["camera"];
      console.log(existingProducer, "existingProducerrrrrrrrr");
      if (existingProducer) {
        // Get new media stream
        const stream = await navigator.mediaDevices.getUserMedia({
          video: {
            width: { ideal: 1280 },
            height: { ideal: 720 },
            frameRate: { ideal: 30 },
          },
        });
        localCameraStream.current = stream;

        if (localVideoRef.current) {
          localVideoRef.current.srcObject = stream;
          localVideoRef.current
            .play()
            .catch((err) => console.error("Error playing local video:", err));
        }

        const newTrack = stream.getVideoTracks()[0];
        await existingProducer.replaceTrack({ track: newTrack });
        await existingProducer.resume();
        if (streamingSocket) {
          streamingSocket.emit(
            "resumeProducer",
            { streamId, producerId: existingProducer.id },
            (response: any) => {
              if (response.error)
                console.error(
                  "Error resuming camera producer:",
                  response.error
                );
            }
          );
        }
        mediaState.current.cameraOn = true;
        mediaState.current.producerStreams[existingProducer.id] = stream;
        console.log("Camera resumed with producer ID:", existingProducer.id);
      } else {
        const stream = await navigator.mediaDevices.getUserMedia({
          video: {
            width: { ideal: 1280 },
            height: { ideal: 720 },
            frameRate: { ideal: 30 },
          },
        });
        localCameraStream.current = stream;

        if (localVideoRef.current) {
          localVideoRef.current.srcObject = stream;
          localVideoRef.current
            .play()
            .catch((err) => console.error("Error playing local video:", err));
        }

        const track = stream.getVideoTracks()[0];
        const producer = await mediaState.current.sendTransport.produce({
          track,
          encodings: [
            { maxBitrate: 100000, scaleResolutionDownBy: 4 },
            { maxBitrate: 300000, scaleResolutionDownBy: 2 },
            { maxBitrate: 900000 },
          ],
          codecOptions: { videoGoogleStartBitrate: 1000 },
          appData: { source: "webcam", userId: user._id },
        });

        mediaState.current.producers["camera"] = producer;
        mediaState.current.producerStreams[producer.id] = stream;
        mediaState.current.cameraOn = true;

        producer.on("transportclose", () => stopCamera());
        producer.on("trackended", () => stopCamera());

        console.log("Camera started with new producer ID:", producer.id);
      }
    } catch (error) {
      console.error("Error starting camera:", error);
      toast.error("Failed to start camera");
    }
  };

  const stopCamera = async () => {
    try {
      const producer = mediaState.current.producers["camera"];
      console.log(
        producer,
        "pausing producer in stop camera function",
        streamingSocketRef.current
      );
      if (!producer) return;

      if (!streamingSocketRef.current) {
        console.error("Streaming socket is not available in stopCamera");
        if (localCameraStream.current) {
          localCameraStream.current
            .getTracks()
            .forEach((track) => track.stop());
          localCameraStream.current = null;
        }
        if (localVideoRef.current) localVideoRef.current.srcObject = null;
        mediaState.current.cameraOn = false;
        return;
      }
      await producer.pause();

      streamingSocketRef.current.emit(
        "pauseProducer",
        { streamId, producerId: producer.id },
        (response: any) => {
          if (response.error)
            console.error("Error pausing camera producer:", response.error);
        }
      );

      if (localCameraStream.current) {
        localCameraStream.current.getTracks().forEach((track) => track.stop());
        localCameraStream.current = null;
      }

      if (localVideoRef.current) localVideoRef.current.srcObject = null;
      mediaState.current.cameraOn = false;
      console.log("Camera paused and tracks stopped");
    } catch (error) {
      console.error("Error pausing camera:", error);
    }
  };
  
  // Start microphone
  const startMicrophone = async () => {
    try {
      if (
        !mediaState.current.device?.canProduce("audio") ||
        !mediaState.current.sendTransport
      ) {
        toast.error("Cannot produce audio - device not ready");
        return;
      }

      const existingProducer = mediaState.current.producers["microphone"];
      if (existingProducer) {
        const stream = await navigator.mediaDevices.getUserMedia({
          audio: {
            echoCancellation: true,
            noiseSuppression: true,
            autoGainControl: true,
          },
        });
        localMicStream.current = stream;

        const newTrack = stream.getAudioTracks()[0];
        await existingProducer.replaceTrack({ track: newTrack });
        await existingProducer.resume();
        if (streamingSocket) {
          streamingSocket.emit(
            "resumeProducer",
            { streamId, producerId: existingProducer.id },
            (response: any) => {
              if (response.error)
                console.error("Error resuming mic producer:", response.error);
            }
          );
        }

        mediaState.current.micOn = true;
        mediaState.current.producerStreams[existingProducer.id] = stream;
        console.log(
          "Microphone resumed with producer ID:",
          existingProducer.id
        );
      } else {
        const stream = await navigator.mediaDevices.getUserMedia({
          audio: {
            echoCancellation: true,
            noiseSuppression: true,
            autoGainControl: true,
          },
        });
        localMicStream.current = stream;

        const track = stream.getAudioTracks()[0];
        const producer = await mediaState.current.sendTransport.produce({
          track,
          codecOptions: {
            opusStereo: true,
            opusDtx: true,
            opusFec: true,
            opusMaxPlaybackRate: 48000,
          },
          appData: { source: "microphone", userId: user._id },
        });

        mediaState.current.producers["microphone"] = producer;
        mediaState.current.producerStreams[producer.id] = stream;
        mediaState.current.micOn = true;

        producer.on("transportclose", () => stopMicrophone());
        producer.on("trackended", () => stopMicrophone());

        console.log("Microphone started with producer ID:", producer.id);
      }
    } catch (error) {
      console.error("Error starting microphone:", error);
      toast.error("Failed to start microphone");
    }
  };
  const stopMicrophone = async () => {
    try {
      const producer = mediaState.current.producers["microphone"];
      if (!producer || !streamingSocketRef.current) return;
      console.log(producer, "pausing producer in stop microphone function");
      await producer.pause();
      streamingSocketRef.current.emit(
        "pauseProducer",
        { streamId, producerId: producer.id },
        (response: any) => {
          if (response.error)
            console.error("Error pausing mic producer:", response.error);
        }
      );

      if (localMicStream.current) {
        localMicStream.current.getTracks().forEach((track) => track.stop());
        localMicStream.current = null;
      }

      mediaState.current.micOn = false;
      console.log("Microphone paused and tracks stopped");
    } catch (error) {
      console.error("Error pausing microphone:", error);
    }
  };
  // Start screen share
  const startScreenShare = async () => {
    try {
      if (
        !mediaState.current.device?.canProduce("video") ||
        !mediaState.current.sendTransport
      ) {
        toast.error("Cannot produce screen share - device not ready");
        return;
      }

      const existingVideoProducer = mediaState.current.producers["screen"];
      if (existingVideoProducer) {
        const stream = await navigator.mediaDevices.getDisplayMedia({
          video: {
            displaySurface: "monitor",
            width: { ideal: 1920 },
            height: { ideal: 1080 },
            frameRate: { ideal: 30 },
          },
          audio: true,
        });
        localScreenStream.current = stream;

        if (localScreenRef.current) {
          localScreenRef.current.srcObject = stream;
          localScreenRef.current
            .play()
            .catch((err) => console.error("Error playing screen share:", err));
        }

        const videoTrack = stream.getVideoTracks()[0];
        await existingVideoProducer.replaceTrack({ track: videoTrack });
        await existingVideoProducer.resume();
        if (streamingSocket) {
          streamingSocket.emit(
            "resumeProducer",
            { streamId, producerId: existingVideoProducer.id },
            (response: any) => {
              if (response.error)
                console.error(
                  "Error resuming screen producer:",
                  response.error
                );
            }
          );
        }

        const audioTrack = stream.getAudioTracks()[0];
        const existingAudioProducer =
          mediaState.current.producers["screen-audio"];
        if (audioTrack && existingAudioProducer) {
          await existingAudioProducer.replaceTrack({ track: audioTrack });
          await existingAudioProducer.resume();
          if (streamingSocket) {
            streamingSocket.emit(
              "resumeProducer",
              { streamId, producerId: existingAudioProducer.id },
              (response: any) => {
                if (response.error)
                  console.error(
                    "Error resuming screen-audio producer:",
                    response.error
                  );
              }
            );
          }
        }

        mediaState.current.screenShareOn = true;
        mediaState.current.producerStreams[existingVideoProducer.id] = stream;
        console.log(
          "Screen share resumed with producer ID:",
          existingVideoProducer.id
        );
      } else {
        const stream = await navigator.mediaDevices.getDisplayMedia({
          video: {
            displaySurface: "monitor",
            width: { ideal: 1920 },
            height: { ideal: 1080 },
            frameRate: { ideal: 30 },
          },
          audio: true,
        });
        localScreenStream.current = stream;

        if (localScreenRef.current) {
          localScreenRef.current.srcObject = stream;
          localScreenRef.current
            .play()
            .catch((err) => console.error("Error playing screen share:", err));
        }

        const videoTrack = stream.getVideoTracks()[0];
        const videoProducer = await mediaState.current.sendTransport.produce({
          track: videoTrack,
          encodings: [{ maxBitrate: 1500000 }],
          codecOptions: { videoGoogleStartBitrate: 1000 },
          appData: { source: "screen", userId: user._id },
        });

        mediaState.current.producers["screen"] = videoProducer;
        mediaState.current.producerStreams[videoProducer.id] = stream;
        mediaState.current.screenShareOn = true;

        videoProducer.on("transportclose", () => stopScreenShare());
        videoProducer.on("trackended", () => stopScreenShare());
        videoTrack.addEventListener("ended", () => stopScreenShare());

        const audioTrack = stream.getAudioTracks()[0];
        if (audioTrack) {
          const audioProducer = await mediaState.current.sendTransport.produce({
            track: audioTrack,
            codecOptions: { opusStereo: true, opusDtx: true },
            appData: { source: "screen-audio", userId: user._id },
          });
          mediaState.current.producers["screen-audio"] = audioProducer;
        }

        console.log("Screen share started with producer ID:", videoProducer.id);
      }
    } catch (error) {
      console.error("Error starting screen share:", error);
      toast.error("Failed to start screen sharing");
      mediaState.current.screenShareOn = false;
    }
  };

  const stopScreenShare = async () => {
    try {
      const videoProducer = mediaState.current.producers["screen"];
      if (videoProducer) {
        await videoProducer.pause();
        if (streamingSocketRef.current) {
          streamingSocketRef.current.emit(
            "pauseProducer",
            { streamId, producerId: videoProducer.id },
            (response: any) => {
              if (response.error)
                console.error("Error pausing screen producer:", response.error);
            }
          );
        }
      }

      const audioProducer = mediaState.current.producers["screen-audio"];
      if (audioProducer) {
        await audioProducer.pause();
        if (streamingSocket) {
          streamingSocket.emit(
            "pauseProducer",
            { streamId, producerId: audioProducer.id },
            (response: any) => {
              if (response.error)
                console.error(
                  "Error pausing screen-audio producer:",
                  response.error
                );
            }
          );
        }
      }

      if (localScreenStream.current) {
        localScreenStream.current.getTracks().forEach((track) => track.stop());
        localScreenStream.current = null;
      }

      if (localScreenRef.current) localScreenRef.current.srcObject = null;
      mediaState.current.screenShareOn = false;
      console.log("Screen share paused and tracks stopped");
    } catch (error) {
      console.error("Error pausing screen share:", error);
    }
  };

  // Consume remote producer
  const consumeTrack = async (
    producerId: string,
    userId: string,
    kind: "audio" | "video",
    appData: any
  ) => {
    try {
      if (
        !mediaState.current.device ||
        !mediaState.current.device.rtpCapabilities ||
        !mediaState.current.receiveTransport ||
        !streamingSocket
      ) {
        console.log(
          "consum track states",
          mediaState.current.device,
          "<- device",
          mediaState.current.receiveTransport,
          "<-recvTransport",
          streamingSocket,
          "<-ssocket"
        );
        console.warn("Cannot consume track - device not ready");
        return;
      }

      // Request to consume the track
      const { id, rtpParameters } = await new Promise<any>(
        (resolve, reject) => {
          streamingSocket.emit(
            "consume",
            {
              producerId,
              rtpCapabilities: mediaState.current.device!.rtpCapabilities,
            },
            (response: any) => {
              if (response.error) {
                reject(response.error);
              } else {
                resolve(response);
              }
            }
          );
        }
      );

      // Create consumer
      const consumer = await mediaState.current.receiveTransport.consume({
        id,
        producerId,
        kind,
        rtpParameters,
        appData: { ...appData, userId },
      });

      // Store the consumer
      mediaState.current.consumers[consumer.id] = consumer;

      console.log(
        mediaState.current.consumers,
        "consumers in consume track function previous to resume",
        consumer,
        "from backend"
      );

      // Resume the consumer
      await new Promise<void>((resolve, reject) => {
        streamingSocket.emit(
          "resumeConsumer",
          {
            streamId,
            consumerId: consumer.id,
          },
          (response: any) => {
            if (response.error) {
              reject(response.error);
            } else {
              resolve();
            }
          }
        );
      });

      // Create a new stream from the consumer's track
      const stream = new MediaStream([consumer.track]);
      console.log(
        stream.getTracks(),
        "stream got from the backend after consuminggggggggggggg"
      );
      mediaState.current.consumerStreams[consumer.id] = stream;

      // Handle consumer events
      consumer.on("transportclose", () => {
        console.log(`Consumer ${consumer.id} transport closed`);
        delete mediaState.current.consumers[consumer.id];
        delete mediaState.current.consumerStreams[consumer.id];
      });

      consumer.on("@close", () => {
        const stream = mediaState.current.consumerStreams[consumer.id];
        if (stream) {
          setParticipantStreams((prev) => {
            const newStreams = { ...prev };
            for (const userId in newStreams) {
              if (newStreams[userId].camera === stream) {
                newStreams[userId].camera = null;
              }
              if (newStreams[userId].screen === stream) {
                newStreams[userId].screen = null;
              }
            }
            return newStreams;
          });
        }
        delete mediaState.current.consumers[consumer.id];
        delete mediaState.current.consumerStreams[consumer.id];
      });

      console.log(
        `Consuming ${kind} track from ${userId} with consumer ID ${consumer.id}`
      );

      return {
        consumerId: consumer.id,
        stream,
        kind,
        userId,
        appData,
      };
    } catch (error) {
      console.error("Error consuming track:", error);
      toast.error("Failed to receive remote stream");
      return null;
    }
  };
  // Initialize MediaSoup device when streamId becomes available
  useEffect(() => {
    if (streamingSocket && streamId) {
      initializeDevice();
    }

  }, [streamingSocket, streamId]);

  useEffect(() => {
    if (streamingSocket) {
      console.log("Streaming socket is connected in liveStudio");
      streamingSocket.emit("joinStudio", { role, user, channelData });

      const handleProducersList = async (producers: any[]) => {
        for (const { producerId, userId, kind, appData } of producers) {
          await consumeTrack(producerId, userId, kind, appData);
        }
      };

      const handleNewProducer = async (producer: any) => {
        const { producerId, userId, kind, appData } = producer;
        const result = await consumeTrack(producerId, userId, kind, appData);
        if (result) {
          const { stream, userId, appData } = result;
          setParticipantStreams((prev) => {
            const newStreams = { ...prev };
            if (!newStreams[userId]) {
              newStreams[userId] = { camera: null, screen: null };
            }
            if (appData.source === "webcam") {
              newStreams[userId].camera = stream;
            } else if (appData.source === "screen") {
              newStreams[userId].screen = stream;
            }
            return newStreams;
          });
        }
      };
      streamingSocket.on(
        "producerPaused",
        ({ producerId, userId }: { producerId: any; userId: any }) => {
          console.log(`Producer ${producerId} paused for user ${userId}`);
          setParticipants((prev) =>
            prev.map((p) =>
              p.userId === userId
                ? {
                    ...p,
                    cameraOn:
                      p.cameraOn &&
                      producerId !== mediaState.current.producers["camera"]?.id,
                    micOn:
                      p.micOn &&
                      producerId !==
                        mediaState.current.producers["microphone"]?.id,
                    screenShareOn:
                      p.screenShareOn &&
                      producerId !== mediaState.current.producers["screen"]?.id,
                  }
                : p
            )
          );
        }
      );
      // Handle producer closed notification
      streamingSocket.on(
        "producerClosed",
        ({ producerId }: { producerId: any }) => {
          console.log(`Producer ${producerId} closed`);
          // Update UI if needed, though consumerClosed may suffice
        }
      );

    

      streamingSocket.on("producersList", handleProducersList);
      streamingSocket.on("newProducer", handleNewProducer);

      streamingSocket.on(
        "producerPaused",
        ({
          producerId,
          userId,
          appData,
        }: {
          producerId: any;
          userId: any;
          appData: any;
        }) => {
          console.log(`Producer ${producerId} paused for user ${userId}`);
          if (appData.source === "webcam") {
            setParticipants((prev) =>
              prev.map((p) =>
                p.userId === userId ? { ...p, cameraOn: false } : p
              )
            );
          } else if (appData.source === "microphone") {
            setParticipants((prev) =>
              prev.map((p) =>
                p.userId === userId ? { ...p, micOn: false } : p
              )
            );
          } else if (appData.source === "screen") {
            setParticipants((prev) =>
              prev.map((p) =>
                p.userId === userId ? { ...p, screenShareOn: false } : p
              )
            );
          }
        }
      );

  

      streamingSocket.on(
        "producerResumed",
        ({
          producerId,
          userId,
          appData,
        }: {
          producerId: any;
          userId: any;
          appData: any;
        }) => {
          console.log(`Producer ${producerId} resumed for user ${userId}`);
          if (appData.source === "webcam") {
            setParticipants((prev) =>
              prev.map((p) =>
                p.userId === userId ? { ...p, cameraOn: true } : p
              )
            );
          } else if (appData.source === "microphone") {
            setParticipants((prev) =>
              prev.map((p) => (p.userId === userId ? { ...p, micOn: true } : p))
            );
          } else if (appData.source === "screen") {
            setParticipants((prev) =>
              prev.map((p) =>
                p.userId === userId ? { ...p, screenShareOn: true } : p
              )
            );
          }
        }
      );

      return () => {
        streamingSocket.off("producersList", handleProducersList);
        streamingSocket.off("newProducer", handleNewProducer);

      };
    }
  }, [streamingSocket, role, user, channelData]);

This is the component where I pass the toggle functions and streams for rendering:

    <div
      className="flex-1 p-4 overflow-auto"
      style={{ background: streamSettings.background }}
    >
      <div className="grid grid-cols-2 gap-4">
        {participants.map((participant) => {
          const cameraStream = participantStreams[participant.userId]?.camera;
          const screenStream = participantStreams[participant.userId]?.screen;
          return (
            <div key={participant.userId} className="flex flex-col gap-2">
              {cameraStream && (
                <div>
                  <p>{participant.username} (Camera)</p>
                  <video
                    autoPlay
                    playsInline
                    muted={true}
                    ref={(el) => handleVideoRef(el, cameraStream)}
                    className="w-full h-auto bg-black rounded"
                  />
                </div>
              )}
              {screenStream && (
                <div>
                  <p>{participant.username} (Screen)</p>
                  <video
                    autoPlay
                    playsInline
                    ref={(el) => handleVideoRef(el, screenStream)}
                    className="w-full h-auto bg-black rounded"
                  />
                </div>
              )}
            </div>
          );
        })}
      </div>
    </div>
  );
};

Check whether the recv transport state in mediasoup-client is connected (so ICE and DTLS are connected) or not.

Here you can see that I logged the receiveTransport in the consumeTrack function.

Sorry but I am not asking you to show more logs. I am recommending that you check Transport ICE and DTLS status because if those are not connected then of course you won’t receive any media stream and the track will be marked as muted.

Currently the streamer consumes its own stream: when they turn on the camera, they consume it for their local preview (for debugging purposes, instead of bringing in another participant). Before consuming, I check and ensure the transports exist on the client. In the initMediasoup function I call createTransport for both send and receive initially, inside the useEffect on first render; those calls are handled by the corresponding backend events like 'connectTransport', and that part works fine.

May I know how to check the ICE and DTLS status? Currently I'm getting streams, so that means it's connected, right?

Please read the mediasoup-client API documentation at mediasoup.org and check the Transport events.
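For example, something along these lines, based on the Transport connectionState property and its connectionstatechange / icegatheringstatechange events (a sketch; the variable names are placeholders from the code above):

```ts
// Sketch of watching a mediasoup-client Transport's ICE/DTLS progress.
// connectionState should go "new" -> "connecting" -> "connected"; if it stays
// "new" or ends up "failed"/"disconnected", no media will flow and the
// consumed tracks will remain muted.
import type { types as mediasoupTypes } from "mediasoup-client";

const watchTransport = (transport: mediasoupTypes.Transport, label: string) => {
  console.log(`${label} connectionState:`, transport.connectionState);

  transport.on("connectionstatechange", (state) => {
    console.log(`${label} connectionState changed to`, state);
  });

  transport.on("icegatheringstatechange", (state) => {
    console.log(`${label} ICE gathering state:`, state);
  });
};

// e.g. watchTransport(mediaState.current.receiveTransport, "recvTransport");
//      watchTransport(mediaState.current.sendTransport, "sendTransport");
```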

These are the transport parameters I received on the client side: