November 16, 2011

gkroeger gkroeger
Lab Rat
40 posts

QString and c++ unicode escapes

 

Excuse my ignorance, but I can’t figure out what is up with the following:

    QString string1(" \u0394");

and

    QString string2(" ");
    string2 += QChar(0x0394);

don’t yield the same result. The latter is what I want (an uppercase Greek delta), but I can’t figure out what the first one is doing.

Thanks
Glenn

[EDIT: code formatting, Volker]

2 replies

November 16, 2011

Gerolf Gerolf
Robot Herder
3253 posts

Hi,

the explanation is simple.
In the first case you pass a char* string, which is just a sequence of bytes in the compiler’s execution character set. That is different from UTF-16, where the \u0394 escape maps to a single 16-bit code unit. The char* string is then converted to a QString according to the codec/locale you use.

This worked:

    QString string1(" \u0394");
    QString string2(" ");
    string2 += QChar(0x0394);
    QString string3 = QString::fromWCharArray(L" \u0394");

    label_1->setText(string1);
    label_2->setText(string2);
    label_3->setText(string3);

Labels 2 and 3 show the correct character.


Nokia Certified Qt Specialist.
Programming Is Like Sex: One mistake and you have to support it for the rest of your life. (Michael Sinz)

November 16, 2011

gkroeger gkroeger
Lab Rat
40 posts

Thanks Gerolf!

My brain wasn’t working very well this morning. As you point out, it is obvious what is happening: the compiler encodes the escape in the narrow literal before QString ever sees it.

 