I need help with libmediasoupclient RecvTransport.

I can now send audio and video; in the broadcaster demo I create a SendTransport for both.
How do I create a RecvTransport for audio and video? Can you write some indicative code here?

Follow the libmediasoupclient API documentation. We wrote it for a reason.

I read the libmediasoupclient API documentation, but CreateRecvTransport has no example.
I wrote some code for it, but it throws here:

// Ensure the device can consume it.
auto canConsume = ortc::canReceive(*rtpParameters, *this->extendedRtpCapabilities);
if (!canConsume)
  throw Exception("cannot consume this Producer");

I modified Broadcaster.cpp like this, after CreateSendTransport succeeds (and after creating the audio and video tracks):
auto recvTransport = this->device.CreateRecvTransport(
  this,
  this->transportId,
  response["iceParameters"],
  response["iceCandidates"],
  response["dtlsParameters"]);

std::unordered_map<std::string, mediasoupclient::Producer*> producers =
  sendTransport->GetProducers();

if (this->device.CanProduce("audio"))
{
  for (auto& producer : producers)
  {
    mediasoupclient::Producer* pro = producer.second;

    if (pro->GetKind() == "audio")
    {
      recvTransport->Consume(
        this,
        std::to_string(rtc::CreateRandomId()),
        pro->GetId(),
        pro->GetKind(),
        &pro->GetRtpParameters());

      break;
    }
  }
}

producers holds the audio Producer and video Producer created on the sendTransport.

Where am I going wrong?

Check this out, a simple mediasoup client/server implementation:

Thanks, but I have little knowledge of Node.js.
I will clone it, install it and run it to see.

Can you tell me how to create a RecvTransport and a Consumer?

Why are you calling ortc::canReceive()? That is not public API; it's not documented and you should not use it at all. libmediasoupclient uses it internally. You must just use the documented libmediasoupclient API (plus the libwebrtc API to get MediaStreamTracks and so on).
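[Editor's note] A minimal sketch of the documented flow the answer above points at. The key point is that the client must not reuse a Producer's own (send-side) RtpParameters; the server generates consumable RtpParameters for this device via mediasoup's server-side transport.consume(). The response/consumerResponse field names below are assumptions about your signaling protocol, not part of libmediasoupclient:

```cpp
// 1. Ask the server to create a WebRtcTransport, then build the client-side
//    RecvTransport from the parameters it returns (mirrors CreateSendTransport).
auto* recvTransport = this->device.CreateRecvTransport(
  this,                            // RecvTransport::Listener
  response["id"],                  // transport id from the server (assumed field)
  response["iceParameters"],
  response["iceCandidates"],
  response["dtlsParameters"]);

// 2. Ask the SERVER to consume the remote Producer, sending
//    this->device.GetRtpCapabilities() in that request. The server-side
//    transport.consume() produces rtpParameters tailored to this device;
//    the Producer's send-side rtpParameters will fail ortc::canReceive().

// 3. Create the client-side Consumer from the server's answer.
auto* consumer = recvTransport->Consume(
  this,                            // Consumer::Listener
  consumerResponse["id"],          // consumer id generated by the server (assumed field)
  consumerResponse["producerId"],
  consumerResponse["kind"],        // "audio" or "video"
  &consumerResponse["rtpParameters"]);

// consumer->GetTrack() then yields the webrtc::MediaStreamTrackInterface*
// to attach to your renderer or audio sink.
```

This cannot run standalone since it needs a live mediasoup server and signaling; treat it as a shape to adapt, not a drop-in implementation.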