Sending real-time images captured with a Unity camera

Server

private void SendImageByte()
{
    image_bytes = cm.Capture();
    print(image_bytes.Length);

    if (connectedTcpClient == null)
    {
        return;
    }

    try
    {
        // Get a stream object for writing.
        NetworkStream stream = connectedTcpClient.GetStream();
        if (stream.CanWrite)
        {
            // Convert the message to a byte array.
            byte[] serverMessageAsByteArray = Encoding.ASCII.GetBytes(image_bytes.ToString());
            // Write the byte array to the socketConnection stream.
            stream.Write(serverMessageAsByteArray, 0, serverMessageAsByteArray.Length);
            Debug.Log("Server sent his message - should be received by client");
        }
    }
    catch (SocketException socketException)
    {
        Debug.Log("Socket exception: " + socketException);
    }
}

Client

import socket

host = "127.0.0.1"
port = 1755
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.connect((host, port))

def receive_image():
    data = sock.recv(999999).decode('utf-8')
    print(len(data))

while True:
    receive_image()

And here is the script that captures the image from the Unity camera:

public byte[] Capture()
{
    if (renderTexture == null)
    {
        // create an off-screen render texture that can be rendered into
        rect = new Rect(0, 0, captureWidth, captureHeight);
        renderTexture = new RenderTexture(captureWidth, captureHeight, 24);
        screenShot = new Texture2D(captureWidth, captureHeight, TextureFormat.RGB24, false);
    }

    _camera.targetTexture = renderTexture;
    _camera.Render();

    // reset the camera's target texture and the active render texture
    _camera.targetTexture = null;
    RenderTexture.active = null;

    // ReadPixels reads from the currently active render texture, so make the
    // off-screen render texture active and then read the pixels
    RenderTexture.active = renderTexture;
    screenShot.ReadPixels(rect, 0, 0);
    screenShot.Apply();

    byte[] imageBytes = screenShot.EncodeToPNG();
    return imageBytes;
}

I am trying to use socket communication in Unity3D to send real-time images from C# to Python for processing and return values back to Unity. The problem is that the number of bytes received by the client does not match what the server sent: I send roughly 400,000 bytes, but only 13 bytes arrive. C# is the server and Python is the client.

Maybe I am doing this wrong altogether, but my main goal is to build a Udacity-style self-driving car simulator.


1 Answer

Are you sure that image_bytes.ToString() returns what you expect, and not something like "System.Byte[]" => 13 characters => 13 bytes?

In general, why convert something that is already a byte[] into a string, only to convert it back into a byte[] for sending? I am pretty sure you do not want to transfer binary image data as UTF-8 text... One option would be a Base64 string, but that is still quite inefficient.
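
To see how much overhead Base64 adds, here is a minimal Python sketch; the 300,000-byte buffer simply stands in for a PNG of similar size:

import base64

# Base64 encodes every 3 raw bytes as 4 ASCII characters, inflating the
# payload by roughly a third
raw = bytes(300000)            # stand-in for ~300 kB of PNG data
encoded = base64.b64encode(raw)
print(len(raw), len(encoded))  # 300000 400000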

Just send the raw bytes directly, like

stream.Write(image_bytes, 0, image_bytes.Length);

and then receive on the other side until you have received exactly that many bytes.

A typical solution is to prepend the length of the message you are about to send, and on the receiving side to actually wait until exactly that many bytes have arrived, e.g.

var lengthBytes = BitConverter.GetBytes(image_bytes.Length);
stream.Write(lengthBytes, 0, lengthBytes.Length);
stream.Write(image_bytes, 0, image_bytes.Length);

Now you know that on the receiver side you first have to receive exactly 4 bytes (= one int), which tell you how many bytes of actual payload to expect.

Now, I am no Python expert, but after a bit of googling I think it would look something like this:

import struct

def receive_image():
    length_bytes = sock.recv(4)
    # BitConverter.GetBytes writes little-endian bytes on most platforms,
    # so unpack with "<i" rather than network byte order ("!i")
    length = struct.unpack("<i", length_bytes)[0]

    data = sock.recv(length)

Note: after reading John Gordon's comments on the question, I guess this still does not fully solve the problem of waiting until the buffer is actually filled (as said, I am no Python expert), but I hope it points you in a direction ;)
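
For completeness: a single sock.recv(length) call may legitimately return fewer than length bytes, so the read has to be looped. A minimal sketch of that idea, using a hypothetical recv_exactly helper and the same module-level sock as the client code above:

import struct

def recv_exactly(sock, n):
    # keep reading until exactly n bytes have accumulated; recv(n) is
    # allowed to return fewer bytes than requested
    chunks = []
    remaining = n
    while remaining > 0:
        chunk = sock.recv(remaining)
        if not chunk:
            raise ConnectionError("socket closed before message completed")
        chunks.append(chunk)
        remaining -= len(chunk)
    return b"".join(chunks)

def receive_image():
    # 4-byte little-endian length prefix, then the PNG payload
    length = struct.unpack("<i", recv_exactly(sock, 4))[0]
    return recv_exactly(sock, length)

The bytes returned by receive_image() are then one complete PNG image, ready to be decoded for processing.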
