WebRTC is, among other things, meant for real-time browser-to-browser media communication, but in my case it will be used for browser-to-server audio communication.
From the information I've gathered, the MediaStream is transferred using RTP over UDP.
This will require at least two additional ports apart from the port used for signalling, something I would like to avoid.
Within WebRTC, is there any possibility to use RTP over WebSocket instead of RTP over UDP, so that I only need port 80 or 443?
No, that will not be possible using WebRTC.
WebRTC was built to give browsers three main features:

- access to media capture devices (microphone and camera),
- peer-to-peer, encrypted (SRTP) audio/video connections, and
- peer-to-peer data channels.

These features are accessible to web applications via a JavaScript API defined here. To access media devices, you use getUserMedia() and get back a MediaStream that you can attach to HTML5 audio and video tags. To create an SRTP session, you create a peer connection and manage the streams it should use.
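As a rough sketch of that capture step (the `player` element id is my own assumption, and the browser APIs only exist in a browser context):

```javascript
// Build the constraints object passed to getUserMedia: audio only, no video.
function audioOnlyConstraints() {
  return { audio: true, video: false };
}

// Ask the browser for a microphone MediaStream and attach it to an
// HTML5 <audio id="player"> element for local monitoring.
async function captureMicrophone() {
  const stream = await navigator.mediaDevices.getUserMedia(audioOnlyConstraints());
  document.getElementById("player").srcObject = stream;
  return stream;
}
```

The returned MediaStream is what you would later add to a peer connection; you never see the RTP packets it produces.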
You have to request an SDP media offer from the browser and send it to the other party over any protocol you like (e.g. WebSockets). When the other party receives your SDP offer, it injects it into its browser, requests an SDP answer, and sends it back. Once both browsers have the offer and answer, they start the SRTP negotiation, using ICE.
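The offer/answer exchange above can be sketched as follows. The JSON message envelope (`{type, sdp}`) is my own convention, not part of any standard, and the WebSocket endpoint is whatever your signalling server provides:

```javascript
// Serialise a session description for transport over the signalling channel.
function wrapSignal(type, sdp) {
  return JSON.stringify({ type, sdp });
}

// Parse an incoming signalling message back into {type, sdp}.
function unwrapSignal(raw) {
  return JSON.parse(raw);
}

// Caller side: create an offer, send it over the WebSocket, and apply
// the answer when it comes back. Only runs in a browser.
async function startCall(ws) {
  const pc = new RTCPeerConnection();
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  stream.getTracks().forEach((track) => pc.addTrack(track, stream));

  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  ws.send(wrapSignal(offer.type, offer.sdp));

  ws.onmessage = async (event) => {
    const msg = unwrapSignal(event.data);
    if (msg.type === "answer") {
      await pc.setRemoteDescription({ type: "answer", sdp: msg.sdp });
    }
  };
  return pc;
}
```

Note that only the SDP text travels over the WebSocket here; once the answer is applied, the browser negotiates and sends the actual media itself, over UDP.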
So you will not have access to the RTP packets themselves to send them over WebSockets.