
digitalmars.D.learn - What's wrong with writefln("%s", glGetString(GL_EXTENSIONS));

reply Cris <central_p hotmail.com> writes:
What's wrong with writefln("%s", glGetString(GL_EXTENSIONS));

printf works but not writefln?

How can I get the string returned by GLubyte* glGetString() into a D 
char[]?

:) I hope I've expressed myself clearly enough. Sorry if I've just written 
incomprehensible garbage.
Mar 01 2006
parent reply =?ISO-8859-1?Q?Anders_F_Bj=F6rklund?= <afb algonet.se> writes:
Cris wrote:

 What's wrong with writefln("%s", glGetString(GL_EXTENSIONS));
 
 printf works but not writefln?
 
 How can I get the returned string by GLubyte* glGetString(); into a D 
 char[]?

Use std.string.toString:

    writefln("%s", toString(glGetString(GL_EXTENSIONS)));

--anders
Mar 01 2006
parent reply Cris <central_p hotmail.com> writes:
It doesn't work that way: "src\engine\renderer.d(104): cannot implicitly 
convert expression ((glGetString)(7939u)) of type ubyte* to char*"


Anders F Björklund wrote:
 Cris wrote:
 
 What's wrong with writefln("%s", glGetString(GL_EXTENSIONS));

 printf works but not writefln?

 How can I get the returned string by GLubyte* glGetString(); into a D 
 char[]?

 Use std.string.toString:
 
     writefln("%s", toString(glGetString(GL_EXTENSIONS)));
 
 --anders

Mar 04 2006
parent reply Chris Sauls <ibisbasenji gmail.com> writes:
Cris wrote:
 it doesn't work that way: "src\engine\renderer.d(104): cannot implicitly 
 convert expression ((glGetString)(7939u)) of type ubyte* to char*"
 

Given that its natural type is ubyte*, you should be able to do this:

# private import std.string;
#
# char[] toString (ubyte* foo) {
#     return std.string.toString(cast(char*) foo);
# }

I would think so, anyhow. If it will not allow a direct cast to char*, you 
can always add an intermediate cast to void* first.

-- Chris Nicholson-Sauls
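For reference, the same recipe as a small self-contained sketch. The string 
literal here is a hypothetical stand-in for what glGetString would actually 
return, and this assumes the old Phobos std.string.toString(char*) overload, 
which slices a null-terminated C string (D string literals carry a trailing 
'\0' for C interop, so the cast is safe for this stand-in):

# private import std.stdio;
# private import std.string;
#
# char[] toString (ubyte* foo) {
#     // slice the null-terminated C string as char[]
#     return std.string.toString(cast(char*) foo);
# }
#
# void main () {
#     // stand-in for glGetString(GL_EXTENSIONS)
#     ubyte* fake = cast(ubyte*) "GL_ARB_multitexture GL_ARB_vertex_buffer_object".ptr;
#     writefln("%s", toString(fake));
# }

Note that the result is a slice over the original buffer, not a copy; 
append .dup if the string needs to outlive the pointer it came from.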
Mar 04 2006
parent Cris <central_p hotmail.com> writes:
Thank you, Chris,

I think this will work for me.


Chris Sauls wrote:
 Cris wrote:
 it doesn't work that way: "src\engine\renderer.d(104): cannot 
 implicitly convert expression ((glGetString)(7939u)) of type ubyte* to 
 char*"

 Given that its natural type is ubyte* you should be able to do this:
 
 # private import std.string;
 #
 # char[] toString (ubyte* foo) {
 #     return std.string.toString(cast(char*) foo);
 # }
 
 I would think so, anyhow. If it will not allow a direct cast to char*, you 
 can always add a middle state cast to void* first.
 
 -- Chris Nicholson-Sauls

Mar 04 2006