I found an explanation of how to decode a hex representation into a decimal value, but only by using Qt: How to get decimal value of a unicode character in c++
As I am not using Qt, and cout << (int)c does not work (Edit: it actually does work if you use it properly!):
How to do the following:
I got the hex representation of two chars that were transmitted over some socket (I just figured out how to get the hex representation!), and combined they yield the following UTF-16 representation:
char c = u"\0b7f"
This shall be converted into its UTF-16 decimal value of 2943! (see the UTF table at www.fileformat.info/info/unicode/char/0b7f/index.htm)
This should be absolutely elementary stuff, but as a designated Python developer compelled to use C++ for a project, I have been stuck on this issue for hours....
Solution: Use a wider character type (char is only 8 bits; you need at least 16), and also the correct form of Unicode literal. This works (live demo):
#include <iostream>

int main() {
    char16_t c = u'\u0b7f';
    std::cout << (int)c << std::endl; // output is 2943 as expected
    return 0;
}