I have this piece of code:

```csharp
private void AnswerToCe(int currentBlock, int totalBlock = 0)
{
    byte[] bufferToSend;
    byte[] macDst = mac;
    byte[] macSrc = ConnectionManager.SInstance.GetMyMAC();
    byte[] ethType;
    byte[] header;

    if (Function == FlashFunction.UPLOAD_APPL || Function == FlashFunction.UPLOAD_BITSTREAM)
    {
        ethType = BitConverter.GetBytes((ushort)EthType.ETH_TYPE_UPLOAD);
        ethType = new byte[] { ethType[1], ethType[0] };
        header = Header.GetBytes((ushort)binaryBlocks.Count, (ushort)(currentBlock + 1),
                                 (ushort)binaryBlocks[currentBlock].Length);
        int index = 0;
        bufferToSend = new byte[macDst.Length + macSrc.Length + ethType.Length +
                                header.Length + binaryBlocks[currentBlock].Length];
        Array.Copy(macDst, 0, bufferToSend, index, macDst.Length);
        index += macDst.Length;
        Array.Copy(macSrc, 0, bufferToSend, index, macSrc.Length);
        index += macSrc.Length;
        Logger.SInstance.Write(index.ToString(), "test index pre");
        Array.Copy(ethType, 0, bufferToSend, index, ethType.Length);
        index += ethType.Length;
        Logger.SInstance.Write(index.ToString(), "test index post");
        Array.Copy(header, 0, bufferToSend, index, header.Length);
        index += header.Length;
        Array.Copy(binaryBlocks[currentBlock], 0, bufferToSend, index, binaryBlocks[currentBlock].Length);
    }
}
```

If I build my application in Debug mode, everything is fine: "test index pre" prints 12 and "test index post" prints 14. The same holds in Release mode with Optimize code unchecked. With Optimize code checked, "test index post" prints 18 instead of 14. I get the same result if I replace `index += ethType.Length;` with `index += 2;`; only `index++; index++;` works. I tried this code in an empty application and the sums are correct. The app is multithreaded, but there is no concurrency here. The code decompiled from the DLL looks fine. Any idea why this happens?
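For reference, the expected index arithmetic can be reproduced in isolation. This is a minimal sketch using placeholder arrays and a hypothetical EtherType value (the real `mac`, `Header`, and `EthType.ETH_TYPE_UPLOAD` values are not shown in the question), with the manual byte swap replaced by an endianness-aware helper:

```csharp
using System;

public class EthFrameDemo
{
    // Convert a ushort to network byte order (big-endian). The original code
    // does this manually by reversing BitConverter's little-endian output.
    public static byte[] ToBigEndian(ushort value)
    {
        byte[] bytes = BitConverter.GetBytes(value);
        if (BitConverter.IsLittleEndian)
            Array.Reverse(bytes);
        return bytes;
    }

    public static void Main()
    {
        byte[] macDst  = new byte[6];          // placeholder destination MAC
        byte[] macSrc  = new byte[6];          // placeholder source MAC
        byte[] ethType = ToBigEndian(0x88B5);  // hypothetical EtherType
        byte[] header  = new byte[6];          // placeholder header
        byte[] payload = new byte[10];         // placeholder payload block

        byte[] buffer = new byte[macDst.Length + macSrc.Length +
                                 ethType.Length + header.Length + payload.Length];
        int index = 0;
        Array.Copy(macDst, 0, buffer, index, macDst.Length);
        index += macDst.Length;
        Array.Copy(macSrc, 0, buffer, index, macSrc.Length);
        index += macSrc.Length;
        Console.WriteLine(index);              // 6 + 6 = 12 ("test index pre")
        Array.Copy(ethType, 0, buffer, index, ethType.Length);
        index += ethType.Length;
        Console.WriteLine(index);              // 12 + 2 = 14 ("test index post")
    }
}
```

Under correct code generation this must print 12 and then 14; the question reports 18 for the second value when the x64 optimizer is enabled.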
EDIT: This happens only when the app is compiled for x64; x86 is fine. EDIT 3: Some info on the build environment: Visual Studio 15.0.0-RTW+26228.4, .NET Framework 4.7.02053. I can trigger this issue on Framework 4.6.2 and 4.7; other frameworks are untested. EDIT 5: New, smaller example project; no dependencies needed. EDIT 6: Disassembly of the test project here. (Too long to post inline.)
Accepted answer
It was an already reported bug in RyuJIT (the .NET Framework x64 JIT compiler); more details here. It will be fixed in a hotfix soon.
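Until the hotfix lands, a commonly documented mitigation for RyuJIT code-generation bugs on .NET Framework 4.6+ (not stated in this thread, so treat it as a general workaround rather than the fix confirmed for this specific bug) is to fall back to the legacy x64 JIT via app.config:

```xml
<configuration>
  <runtime>
    <!-- Use the pre-4.6 x64 JIT instead of RyuJIT for this process only -->
    <useLegacyJit enabled="1" />
  </runtime>
</configuration>
```

Alternatively, decorating the affected method with `[MethodImpl(MethodImplOptions.NoOptimization | MethodImplOptions.NoInlining)]` suppresses JIT optimization for just that method, which matches the question's observation that the bug only appears with Optimize code enabled.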