Changes between Version 4 and Version 5 of Ticket #9801, comment 2
Timestamp: Oct 4, 2023, 5:19:33 PM
Ticket #9801, comment 2
The C# reading code allocates an array for the full size of the message and fills it as data is read, so there is no concatenation to slow down the read.

One possibility is that the async read is not getting CPU time very often because Unity is not querying the task often enough, maybe only once per rendering update loop. The code currently reads while data is available and then does a stream.ReadAsync() call. I could test that by always doing a synchronous stream.Read() until the whole message is read. The same problem could be slowing the writing to the socket using WriteAsync(); that can be changed to Write() as a test. With synchronous read and write it took the same amount of time. Printing the synchronous read time gives 1.5 seconds for 31 Mbytes with a synchronous write. That is not too far off the expected time and could be due to stream buffering, which is not optimal for large messages. If I use async read and write, then the read time is reported as 1.7 seconds, so only about 15% slower with async. Opening the model took 0.9 seconds after the message read. (Each of these timings was on the second load, so file data and module initializations are done.)

Maybe the large JSON message encoding and decoding is also slow?
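The read strategy described in the comment, preallocating the full message buffer and looping on synchronous Read() calls until it is filled, can be sketched roughly as below. `MessageReader` and `ReadFullMessage` are illustrative names for this sketch, not the ticket's actual code:

```csharp
using System;
using System.IO;

static class MessageReader
{
    // Allocate the whole message buffer up front and fill it as data
    // arrives, so there is no per-chunk concatenation or reallocation.
    public static byte[] ReadFullMessage(Stream stream, int messageSize)
    {
        var buffer = new byte[messageSize];   // single allocation
        int offset = 0;
        while (offset < messageSize)
        {
            // Read() may return fewer bytes than requested, so loop
            // until the entire message has been received.
            int n = stream.Read(buffer, offset, messageSize - offset);
            if (n == 0)
                throw new EndOfStreamException(
                    "Stream closed before the full message was read.");
            offset += n;
        }
        return buffer;
    }
}
```

An async variant would replace `stream.Read(...)` with `await stream.ReadAsync(...)`; the timing difference reported above (1.5 s vs 1.7 s for 31 Mbytes) suggests the awaited tasks are resumed promptly rather than only once per Unity update loop.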