Hot questions for Using Lightweight Java Game Library in crash

Question:

This issue is now fixed. My shader attributes were not bound correctly.

I have a game which, when run from the IDE, looks like this:

However, when I export it from Eclipse with these settings,

The texturing is completely incorrect. The textures are still loaded, but they are not wrapping correctly onto the object.

The code is exactly the same, as I only just exported it, and I am currently running both instances of the game at the same time (one working fine from the IDE, one looking wrong from the exported JAR).

I have also copied all the resources from the IDE directory to the folder with the external JAR. IDE directory:

Directory I run the external JAR from:

Also, I know that the textures are actually loading - They are not wrapping correctly. I know this because:

  • If you look at the plane, you can see that it still has elements of the texture - They are just all stretched and messed up.
  • Same thing with the Skybox. If you look at that, parts are still there, but again, it's incorrectly wrapped around the OBJ model.
  • If I hit the Q key to render the terrain with a DisplayList (Wrapping the terrain texture multiple times), it shows the texture. Can't get a screenshot of this because I can't hit Q and take a screenshot.

I have checked inside the JAR file, and the fragment and the vertex shader are still there. The correct libraries also appear to be there (Besides, if they weren't, the game would not even start).

Update: As I was restarting the game multiple times to check for more information, I noticed that as soon as the display shows (so I can see the incorrectly-textured terrain and plane), the game freezes about 90% of the time. No stack trace, just a "This window is not responding" message, and Windows closes it. The game still works perfectly, without crashing, when I run it in the IDE.

The server for the game exports and runs perfectly. Only the client is the issue.

What about exporting could make the game any different than running it in the IDE, and how can I solve it?

Update: So here is my texture loading code:

loader.loadTexture("PlaneTexture", 1);
//loadTexture() method:
public int loadTexture(String fileName, int mipmap) {
    Texture texture = null;
    try {
        try {
            //TextureLoader is a Slick-Util class.
            texture = TextureLoader.getTexture("PNG", new FileInputStream("res/" + fileName + ".png"));
        } catch (Exception e) {
            e.printStackTrace();
        }
        if (texture == null) {
            throw new Exception("Null texture!");
        }
        //texture = TextureLoader.getTexture("GIF", new FileInputStream("res/" + fileName + ".png"));
        if (mipmap > -10) {
            GL30.glGenerateMipmap(GL11.GL_TEXTURE_2D);
            GL11.glTexParameteri(GL11.GL_TEXTURE_2D, GL11.GL_TEXTURE_MIN_FILTER, GL11.GL_LINEAR_MIPMAP_LINEAR);
            GL11.glTexParameterf(GL11.GL_TEXTURE_2D, GL14.GL_TEXTURE_LOD_BIAS, mipmap);
        }
    } catch (Exception e) {
        e.printStackTrace();
        System.err.println("Tried to load texture " + fileName + ".png, didn't work");
        System.exit(1);
        return -1;
    }
    textures.add(texture.getTextureID());
    return texture.getTextureID();
}

I now have the texture ID of the texture. I then render the object (in this case the plane) like this:

Plane you = Main.TerrainDemo.shipsID.get(Main.TerrainDemo.UID);
Main.TerrainDemo.shader.start();
TexturedModel texturedModel = TerrainDemo.shipModel; //The plane model
RawModel model = texturedModel.getRawModel();
GL30.glBindVertexArray(model.getVaoID());
GL20.glEnableVertexAttribArray(0);
GL20.glEnableVertexAttribArray(1);
GL20.glEnableVertexAttribArray(2);
GL13.glActiveTexture(GL13.GL_TEXTURE0);
GL11.glBindTexture(GL11.GL_TEXTURE_2D, TerrainDemo.shipModel.getTexture().getID()); //The ID of the texture.

glDrawElements(GL_TRIANGLES, model.getVertexCount(), GL11.GL_UNSIGNED_INT, 0);

And, although I don't think they're important, my vertex and fragment shaders:

//Vertex shader
#version 130

in vec3 position;
in vec2 textureCoords;
in vec3 normal;

out vec2 pass_textureCoords;
out vec3 surfaceNormal;
out vec3 toLightVector;

uniform mat4 transformationMatrix;
uniform vec3 lightPosition;

void main(void){
    gl_Position = ftransform();
    pass_textureCoords = textureCoords;
    surfaceNormal = (transformationMatrix * vec4(normal, 0.0)).xyz;
    toLightVector = (vec3(500, 50000, 500)) - (transformationMatrix * vec4(position, 1.0)).xyz;
}

//Fragment shader
#version 130

in vec2 pass_textureCoords;
in vec3 surfaceNormal;
in vec3 toLightVector;

out vec4 out_Color;

uniform sampler2D textureSampler;

uniform vec3 lightColour;

void main(void){
    vec3 unitNormal = normalize(surfaceNormal);
    vec3 unitLightVector = normalize(toLightVector);

    float nDot1 = dot(unitNormal, unitLightVector);
    float brightness = max(nDot1, 0.2);
    brightness = brightness + 0.5;
    vec3 diffuse = brightness * vec3(1, 1, 1);

    vec4 textureColor = vec4(diffuse, 1.0) * texture(textureSampler, pass_textureCoords);
    if (textureColor.a < 1) {
        discard;
    }
    out_Color = vec4(textureColor.r, textureColor.g, textureColor.b, 1);
}

Again, I will stress that all of this is working perfectly if the game is running from the IDE. It is just if it runs from an external JAR that the issue occurs.

I will be experimenting with different texture loading techniques and methods (e.g. packing textures into the JAR) and seeing if anything different happens.

Yet another update: So, I sent the game to another person (they also use Windows 8), and the game worked perfectly! No texturing errors whatsoever! So now I'm unsure if the problem is with my PC specifically or something else.

For those who wish to try, you can download the game at http://endcraft.net/PlaneGame and see it yourself (Please read the instructions.txt - Also, you'll need a program to decompress .rar files).

I will be getting as many people as I know to give the game a go and see if they have the same issue or if the texturing is correct.

It completely baffles me that the game works fine when I run it from the IDE, does not work when I export it into an external JAR, but does work when I export it into an external JAR and send it to someone else!

(another) Update: I have sent the game to multiple people, some of them are coming across this crash:

# A fatal error has been detected by the Java Runtime Environment:
#
#  EXCEPTION_ACCESS_VIOLATION (0xc0000005) at pc=0x1cd6b57f, pid=36828,  tid=35556
#
# JRE version: Java(TM) SE Runtime Environment (8.0_31-b13) (build 1.8.0_31-b13)
# Java VM: Java HotSpot(TM) Client VM (25.31-b07 mixed mode windows-x86 )
# Problematic frame:
# C  [atioglxx.dll+0xc0b57f]
#

In the log file, I see this:

Stack: [0x011d0000,0x01220000],  sp=0x0121f1e8,  free space=316k
Native frames: (J=compiled Java code, j=interpreted, Vv=VM code, C=native code)
C  [atioglxx.dll+0xc0b57f]
C  [atioglxx.dll+0xbe8b95]
C  [atioglxx.dll+0x641852]
C  [lwjgl.dll+0x9a28]
j  org.lwjgl.opengl.GL20.glUniformMatrix4(IZLjava/nio/FloatBuffer;)V+33
j  TMLoading.ShaderProgram.loadMatrix(ILorg/lwjgl/util/vector/Matrix4f;)V+23
j  TMLoading.StaticShader.loadTransformationMatrix(Lorg/lwjgl/util/vector/Matrix4f;)V+6
j  Joehot200.Plane.render()V+407
j  Joehot200.TerrainDemo.render()V+4045
j  Joehot200.TerrainDemo.enterGameLoop()V+356
j  Joehot200.TerrainDemo.startGame()V+320
j  StartScreenExperiments.Test2.resartTDemo()V+128
j  StartScreenExperiments.Test2.main([Ljava/lang/String;)V+27
v  ~StubRoutines::call_stub
V  [jvm.dll+0x1473e5]

In other words, the entire issue (the texturing and the crashes) is beginning to look a lot like it is related to the shaders (either passing information to them, or the actual shader code itself).

I have also done more testing, and the texturing works fine without shaders using a DisplayList.

Here is the code up to the glUniformMatrix() call:

//Some unnecessary code has been removed. For example, you do not care about what colour I make the plane depending on what team it is on.    

//Plane class. (Referring to a jet plane which is the main object the player controls)
public void render(){
    twodcoords = TextDemo.getScreenCoords(sx, sy + 30, sz);
    glPushAttrib(GL_ENABLE_BIT);
    glBindBuffer(GL_ARRAY_BUFFER, 0);
    glPushMatrix();
    glTranslatef(sx, sy, sz);
    GL30.glBindVertexArray(Main.TerrainDemo.vaoID);

    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_NORMAL_ARRAY);
    glRotatef(srot, 0f, 1f, 0f);
    glRotatef(pitch, -1f, 0f, 0f);
    Main.TerrainDemo.shader.start();
    glPushMatrix();
    glDisable(GL_LIGHTING);
    TexturedModel texturedModel = TerrainDemo.shipModel;
    RawModel model = texturedModel.getRawModel();
    GL30.glBindVertexArray(model.getVaoID());
    GL20.glEnableVertexAttribArray(0);
    GL20.glEnableVertexAttribArray(1);
    GL20.glEnableVertexAttribArray(2);
    GL13.glActiveTexture(GL13.GL_TEXTURE0);
    GL11.glBindTexture(GL11.GL_TEXTURE_2D, TerrainDemo.shipModel.getTexture().getID());
    //Creates a transformation matrix from the X, Y, Z, pitch, yaw, roll, and scale of the plane.
    org.lwjgl.util.vector.Matrix4f m = Assist.createTransformationMatrix(new Vector3f(sx, sy, sz), new Vector3f(pitch, rot, roll), new Vector3f(5, 5, 5));
    Main.TerrainDemo.shader.loadTransformationMatrix(m);
    glDrawElements(GL_TRIANGLES, model.getVertexCount(), GL11.GL_UNSIGNED_INT, 0);
    GL20.glDisableVertexAttribArray(0);
    GL20.glDisableVertexAttribArray(1);
    GL20.glDisableVertexAttribArray(2);
    GL30.glBindVertexArray(0);
    glPopMatrix();
    glPopAttrib();
}
//StaticShader class. This method just forwards the matrix to loadMatrix().
public void loadTransformationMatrix(Matrix4f matrix){
    super.loadMatrix(location_transformationMatrix, matrix);
}
//ShaderProgram class.
FloatBuffer buf = BufferUtils.createFloatBuffer(4 * 4);
public void loadMatrix(int location, Matrix4f matrix){
    matrix.store(buf);
    buf.flip();
    GL20.glUniformMatrix4(location, false, buf);
}

Update - So, with one hour left on the Bounty, I thought I'd add a few more details:

  • As I probably said somewhere above, the game works for some people when exported, but not for others. I've noticed that the game has always worked when run on Java 7, but with only me and one other tester on Java 7, this really isn't conclusive.
  • The texturing renders correctly in a DisplayList. The textures are loading. However, they are not being displayed correctly.
  • Even if you don't know the problem, trying out the game (ignore the start screen, I need to make a new one) and telling me the results, as well as your OS/Java details, etc, would be really appreciated.
  • Yes, the mipmapping is correct. I know someone in the comments mentioned it possibly wasn't, but I've tried setting it stupidly high and I do indeed get a very blurred texture.
  • I've already tried "package libraries into external jar". I appreciate the time taken for that answer, but I did say in the comments that I've already tried it.
  • The issue may be the fragment shader (as someone suggested in the comments), but I am currently unsure how to test it, nor do I understand why it would work inside the IDE but not outside of it.

Answer:

I can't test this, so I'm not sure if this will help, but some implementations of OpenGL don't save element buffers in VAOs. So try binding the element buffer before the call to glDrawElements, e.g.

glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, someBuffer);
glDrawElements(GL_TRIANGLES, model.getVertexCount(), GL11.GL_UNSIGNED_INT, 0);

If you don't do this, you may be using the last bound buffer for all draw calls. Probably not the answer, but it couldn't hurt to test it. I also noticed that you are using an older version of Slick, as you still have the slick-util, which is not part of the latest release, so you could also try updating that.

Question:

I just recently updated my libgdx project from 1.4.x to 1.6.1. I use BitmapFontCache for the dialogue in my game, drawing a string character by character using BitmapFontCache.draw(start, end). This worked fine in 1.4.x, but after making the necessary changes to get 1.6.1 to build, it seems to crash when wrapping is enabled, after the last character is displayed. Strangely, this does not seem to be a problem with one-line strings.

Here is how I add my text:

fontCache.addText( message, fontPosX, fontPosY, fontWidth, Align.left, true);

Then I increment the character count and draw. currentCharacter stops when reaching the end of the string based on its length:

fontCache.draw( batch, 0, currentCharacter );

This worked fine in 1.4.x even with multi-line wrapped strings, but now seems to cause an out-of-bounds exception if the text wraps to a second line (it crashes after drawing the last character). Here is the line causing the crash in SpriteBatch:

System.arraycopy(spriteVertices, offset, vertices, idx, copyCount);

Is there a new way I need to calculate the length of the string for drawing? Do I need to use the returned GlyphLayout in some way? Or is this perhaps a bug?


Answer:

OK, I know where the issue lies, and I'm pretty certain it's a bug in libgdx.

I also have a workaround, although it's a little hacky.

The problem: when GlyphLayout wraps a line on a space character, it optimises out the terminating space. With the space removed, the total number of glyphs in the layout is now less than the number of characters in the string. The more lines that wrap on a space character, the bigger the discrepancy between the two.

The workaround: to work out what length to use for rendering the full text, we need to count the number of glyphs in the GlyphLayout instead of the number of characters in the String. Here's some code that does that:

private int calcLength(GlyphLayout glyphLayout) {

    int length = 0;
    for(GlyphLayout.GlyphRun run : glyphLayout.runs) {
        length += run.glyphs.size;
    }
    return length;
}

The GlyphLayout to pass in will be the one that was returned by the BitmapFontCache.addText() method.
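To see why the two counts diverge, here is a plain-Java sketch, independent of libgdx (the `wrap` helper is made up for illustration), of a greedy word wrapper that, like GlyphLayout, discards the space a line breaks on:

```java
import java.util.ArrayList;
import java.util.List;

public class WrapDemo {

    // Greedy word wrap that, like GlyphLayout, drops the space a line breaks on.
    static List<String> wrap(String text, int maxCols) {
        List<String> lines = new ArrayList<>();
        StringBuilder line = new StringBuilder();
        for (String word : text.split(" ")) {
            if (line.length() == 0) {
                line.append(word);
            } else if (line.length() + 1 + word.length() <= maxCols) {
                line.append(' ').append(word);
            } else {
                lines.add(line.toString());      // the separating space is discarded here
                line = new StringBuilder(word);
            }
        }
        lines.add(line.toString());
        return lines;
    }

    // Total glyphs actually laid out, analogous to summing run.glyphs.size.
    static int glyphCount(List<String> lines) {
        int n = 0;
        for (String l : lines) n += l.length();
        return n;
    }
}
```

For `"hello brave new world"` (21 characters) wrapped at 11 columns, the layout holds only 20 glyphs, so iterating the draw index up to the String's length overruns the glyph data by one per wrapped line, which matches the arraycopy out-of-bounds crash in SpriteBatch.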

Question:

I am using LWJGL to make my new game.

Quick Question:

When my game crashes with something like this:

Exception in thread "main" java.lang.NullPointerException
    at engineTester.MainGameLoop.main(MainGameLoop.java:376)
C:\Users\Kojo Ofori\AppData\Local\NetBeans\Cache\8.1\executor-snippets\run.xml:53: Java returned: 1

How do I get the game to show the error in a dialog, so that when someone runs it and it crashes, they see a dialog telling them it has crashed?

I hope that makes sense.


Answer:

I recommend using an uncaught-exception handler to catch any exceptions that aren't handled and would otherwise crash the game. You weren't specific about which version of LWJGL you are using, but LWJGL 2 has a Sys class that lets you display an alert:

Sys.alert(Title, Message)

Unfortunately, LWJGL 3 removed this class, and GLFW doesn't seem to support creating message boxes at this time.

Set the UncaughtHandler (first line in the main method):

Thread.setDefaultUncaughtExceptionHandler(UncaughtHandler.instance);

My UncaughtHandler class:

public class UncaughtHandler implements Thread.UncaughtExceptionHandler{
    public static UncaughtHandler instance = new UncaughtHandler();

    @Override
    public void uncaughtException(Thread t, Throwable e) {
        Sys.alert("Uncaught Exception!", e.getMessage());
    }
}
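For LWJGL 3, one possible substitute (my suggestion, not anything LWJGL provides) is a plain Swing JOptionPane; Swing is independent of the OpenGL context, and it also lets you show the full stack trace rather than just getMessage(). A minimal sketch:

```java
import java.io.PrintWriter;
import java.io.StringWriter;
import javax.swing.JOptionPane;

public class CrashDialog implements Thread.UncaughtExceptionHandler {

    // Render the whole stack trace to a String so the user can copy/report it.
    static String stackTraceOf(Throwable e) {
        StringWriter sw = new StringWriter();
        e.printStackTrace(new PrintWriter(sw));
        return sw.toString();
    }

    @Override
    public void uncaughtException(Thread t, Throwable e) {
        // Swing does not depend on the game window, so this still works
        // after the OpenGL display has died.
        JOptionPane.showMessageDialog(null, stackTraceOf(e),
                "Uncaught Exception!", JOptionPane.ERROR_MESSAGE);
    }

    public static void main(String[] args) {
        // Register before any game code runs, just as with the Sys.alert version.
        Thread.setDefaultUncaughtExceptionHandler(new CrashDialog());
    }
}
```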

Question:

I'm using LWJGL 3 and learning modern OpenGL (3.x). I want to send a uniform matrix to the vertex shader so I can apply transformations. I tried that, and the program crashes with this error:

#
# A fatal error has been detected by the Java Runtime Environment:
#
#  EXCEPTION_ACCESS_VIOLATION (0xc0000005) at pc=0x0000000073a9820d, pid=8644, tid=2760
#
# JRE version: Java(TM) SE Runtime Environment (8.0_31-b13) (build 1.8.0_31-b13)
# Java VM: Java HotSpot(TM) 64-Bit Server VM (25.31-b07 mixed mode windows-amd64 compressed oops)
# Problematic frame:
# C  [nvoglv64.DLL+0xd5820d]
#
# Failed to write core dump. Minidumps are not enabled by default on client versions of Windows
#
# An error report file with more information is saved as:
# E:\Copy\Code\Personal\atei-graphics\GraphicsProject\hs_err_pid8644.log
#
# If you would like to submit a bug report, please visit:
#   http://bugreport.java.com/bugreport/crash.jsp
# The crash happened outside the Java Virtual Machine in native code.
# See problematic frame for where to report the bug.
#

Clearly I'm doing something wrong.

The problem seems to be in this line of code

    glUniformMatrix4(uniformMatrixLocation, false, mvp.getBuffer());

The program executes without error if I remove this line of code.

I tried passing a diagonal matrix to check whether the problem was with the matrix itself, and still got the same result.

mvp is the diagonal matrix I am passing to the shader. uniformMatrixLocation holds the location I found with this line of code:

    glGetUniformLocation(shaderProgram.programId, "MVP");

which doesn't return a negative number, so there is probably no error here.

I am using this library for the Mat4 class: https://github.com/jroyalty/jglm

Below is a "working" example of my code, as small as I can get it.

    //glfw create windows and etc

    int programId = glCreateProgram();

    int vertexShaderId = glCreateShader(GL_VERTEX_SHADER);
    glShaderSource(vertexShaderId, FileUtils.readFile("shaders/vertex.glsl"));
    glCompileShader(vertexShaderId);
    if (glGetShaderi(vertexShaderId, GL_COMPILE_STATUS) == GL_FALSE) {
        throw new RuntimeException("Error creating vertex shader\n"
                + glGetShaderInfoLog(vertexShaderId, glGetShaderi(vertexShaderId, GL_INFO_LOG_LENGTH)));
    }
    glAttachShader(programId, vertexShaderId);
    int fragmentShaderId = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(fragmentShaderId, FileUtils.readFile("shaders/fragment.glsl"));
    glCompileShader(fragmentShaderId);
    if (glGetShaderi(fragmentShaderId, GL_COMPILE_STATUS) == GL_FALSE) {
        throw new RuntimeException("Error creating vertex shader\n"
                + glGetShaderInfoLog(fragmentShaderId, glGetShaderi(fragmentShaderId, GL_INFO_LOG_LENGTH)));
    }
    glAttachShader(programId, fragmentShaderId);
    glLinkProgram(programId);
    if (glGetProgrami(programId, GL_LINK_STATUS) == GL_FALSE) {
        throw new RuntimeException("Unable to link shader program:");
    }

    /*
        Hold the location of the matrix in the shader
    */
    int uniformMatrixLocation = glGetUniformLocation(programId, "MVP");


    float[] vertices = {
        +0.0f, +0.8f, 0,
        -0.8f, -0.8f, 0,
        +0.8f, -0.8f, 0
    };

    int vaoId = glGenVertexArrays();
    glBindVertexArray(vaoId);

    FloatBuffer verticesBuffer = BufferUtils.createFloatBuffer(vertices.length);
    verticesBuffer.put(vertices).flip();

    int vboId = glGenBuffers();
    glBindBuffer(GL_ARRAY_BUFFER, vboId);
    glBufferData(GL_ARRAY_BUFFER, verticesBuffer ,GL_STATIC_DRAW);

    glVertexAttribPointer(0, 3, GL_FLOAT, false, 0, 0);
    glBindVertexArray(0);

    glClearColor(0,0,0,1);         
    while (glfwWindowShouldClose(window) == GL_FALSE) {
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT); // clear the framebuffer
        glUseProgram(programId);

        Mat4 mvp = new Mat4(1.0f);//diagonal matrix , should just show a triangle on the screen

        /*
            Crashes here
        */
        //glUniformMatrix4(uniformMatrixLocation, false, mvp.getBuffer());

        glBindVertexArray(vaoId);
        glEnableVertexAttribArray(0);

        glDrawArrays(GL_TRIANGLES, 0, 3);

        glDisableVertexAttribArray(0);
        glBindVertexArray(0);

        glUseProgram(0);

        glfwPollEvents();
        glfwSwapBuffers(window); // swap the color buffers
    }

    //dispose stuff no point showing them here

Vertex shader

    #version 330 core


    layout(location = 0) in vec3 position;

    uniform mat4 MVP;

    void main(){

         gl_Position =  MVP*vec4(position,1);

    }

Fragment shader

    #version 330 core

    out vec4 color;

    void main()
    {

         color = vec4(1,1,1,1);

    }

Excuse me if this has been asked before; I searched the web and didn't find anything useful. Thank you in advance.


Answer:

The crash is most likely happening because you are passing LWJGL an on-heap float buffer returned from Mat4.getBuffer(). LWJGL requires that you pass it direct buffers:

LWJGL requires that all NIO buffers passed to it are direct buffers. Direct buffers essentially wrap an address that points to off-heap memory, i.e. a native pointer. This is the only way LWJGL can safely pass data from Java code to native code, and vice-versa, without a performance penalty. It does not support on-heap Java arrays (or plain NIO buffers, which wrap them) because arrays may be moved around in memory by the JVM's garbage collector while native code is accessing them. In addition, Java arrays have an unspecified layout, i.e. they are not necessarily contiguous in memory.

You can use the BufferUtils class, as you are already doing for your vertices (note that flip() must be a separate statement on Java 8, since it returns a plain Buffer):

FloatBuffer matrixBuffer = BufferUtils.createFloatBuffer(16).put(mvp.getBuffer());
matrixBuffer.flip();

...

glUniformMatrix4(uniformMatrixLocation, false, matrixBuffer);

If that doesn't work, try this instead:

FloatBuffer matrixBuffer = BufferUtils.createFloatBuffer(16).put(new float[] {
    1.0f, 0.0f, 0.0f, 0.0f,
    0.0f, 1.0f, 0.0f, 0.0f,
    0.0f, 0.0f, 1.0f, 0.0f,
    0.0f, 0.0f, 0.0f, 1.0f});
matrixBuffer.flip();
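As a side note on what "direct" means here: LWJGL's BufferUtils.createFloatBuffer is, roughly speaking (this is a sketch of the idea, not LWJGL's actual source), a native-ordered direct buffer, and the heap/direct distinction can be verified with plain NIO:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

public class DirectBuffers {

    // Roughly what BufferUtils.createFloatBuffer does: a direct, native-ordered
    // buffer whose memory lives off the Java heap, so native code can read it
    // safely without the garbage collector moving it.
    static FloatBuffer createDirectFloatBuffer(int floats) {
        return ByteBuffer.allocateDirect(floats * Float.BYTES)
                         .order(ByteOrder.nativeOrder())
                         .asFloatBuffer();
    }

    public static void main(String[] args) {
        FloatBuffer heap = FloatBuffer.allocate(16);       // array-backed: unsafe to pass to LWJGL
        FloatBuffer direct = createDirectFloatBuffer(16);  // safe to pass to glUniformMatrix4
        System.out.println(heap.isDirect());   // false
        System.out.println(direct.isDirect()); // true
    }
}
```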

Question:

I'm working on creating a 2D Java game with a lighting engine with OpenGL using LWJGL, but I've hit a wall when trying to link up keyboard inputs.

The render loop works fine, but as soon as I tried to implement a Frame/Canvas and the getParent/KeyListener combination, the application crashes immediately after starting up. I have to shut the application down in NetBeans; the window doesn't even respond to right-clicking the application's entry in the taskbar.

public static void main(String[] args) {
  Main main = new Main();
  main.run();

}

public void run() {
  initialize();
  //animLoop();
}

private void initialize() {
  try {
    Frame theFrame = new Frame("Inlight");
    Canvas theCanvas = new Canvas();
    theCanvas.setMinimumSize(new Dimension(windowWidth, windowHeight));
    theFrame.setSize(windowWidth, windowHeight);
    theCanvas.requestFocusInWindow();

    theFrame.add(theCanvas);

    theFrame.setVisible(true);
    //before doing the following, I need to create the canvas within which OpenGL does its rendering
    //create it before applying keylistener

    Display.setDisplayMode(new DisplayMode(windowWidth, windowHeight));
    Display.setParent(theCanvas);
    Display.getParent().addKeyListener(new InlightKeyListener());
    Display.create(new PixelFormat(0, 16, 1));

  } catch (Exception e) {
    e.printStackTrace();
  }

  shaderProgram = glCreateProgram();
  fragmentShader = glCreateShader(GL_FRAGMENT_SHADER);
  StringBuilder fragmentShaderSource = new StringBuilder();

  try {
    String line;
    BufferedReader reader = new BufferedReader(new FileReader("src/shader.frag"));
    //points to the shader doc
    while ((line = reader.readLine()) != null) {
      fragmentShaderSource.append(line).append("\n");
    }
  } catch (IOException e) {}

  glShaderSource(fragmentShader, fragmentShaderSource);
  glCompileShader(fragmentShader);
  if (glGetShaderi(fragmentShader, GL_COMPILE_STATUS) == GL_FALSE) {
    System.err.println("Fragment shader not compiled!");
  }

  glAttachShader(shaderProgram, fragmentShader);
  glLinkProgram(shaderProgram);
  glValidateProgram(shaderProgram);


  glMatrixMode(GL_PROJECTION);
  glLoadIdentity();
  glOrtho(0, windowWidth, windowHeight, 0, 1, -1);
  glMatrixMode(GL_MODELVIEW);

  glEnable(GL_STENCIL_TEST);
  glClearColor(0, 0, 0, 0);
  System.out.println("Done initialize");
}

public synchronized void animLoop() {
  //This method will loop the render
  long startTime = System.currentTimeMillis();
  //sets the starting time to the current time
  long curTime = startTime;
  //The current time measurement, so at the start curTime = starting time
  while (curTime - startTime < 1800) {
    long timePassed = System.currentTimeMillis() - curTime;
    //Makes the timePassed variable equal to the System's current time - the last measured current time.
    curTime += timePassed;
    //updates the measurement of the current time to the actual current time. (I imagine some small amount of time is lost while it is updated; this is negligible.)
    organiseTitle();
    //sets up the objects to display the title screen for 1800 milliseconds
    render();
    //draws the new screen scene to the display
    clearObj();
    //cleans up the items built for that last render
  }

  while (!Display.isCloseRequested()) {
    long timePassed = System.currentTimeMillis() - curTime;
    //Makes the timePassed variable equal to the System's current time - the last measured current time.
    curTime += timePassed;
    //updates the measurement of the current time to the actual current time. (I imagine some small amount of time is lost while it is updated; this is negligible.)
    Organiselevel1(20, 200);
    render();
    //draws the new screen scene to the display
    clearObj();
    //cleans up the items built for that last render
  }
  glDeleteShader(fragmentShader);
  glDeleteProgram(shaderProgram);
  Display.destroy();
  //closes the window
}

//There's more code after this point of course, but it's all already been tested and works.


Answer:

Oh god I only just realized the call to my main loop was commented out.

/facepalm

Question:

Even though this question has been asked multiple times (I read all of those answers, and no solution worked for me), I am trying to model a rectangle with LWJGL and OpenGL, but it crashes every time. Here are my PC stats:

AMD-Ryzen 1600x | MSI Nvidia GTX 1060 (6GB) | MSI x370 Carbon Pro Motherboard

I also tried this on an Intel setup, with an i7 processor and an Nvidia Quadro K1000M, but got the same error, which you can see here:

https://hastebin.com/ayiqiritov.makefile

My Drawing Method:

public void render(RawModel model){
    GL30.glBindVertexArray(model.getVaoID());
    GL20.glEnableVertexAttribArray(0);
    GL11.glDrawArrays(GL11.GL_TRIANGLES, 0, model.getVertexCount());
    GL20.glDisableVertexAttribArray(0);
    GL30.glBindVertexArray(0);
}

In this class I create VAOs and the VBOs and store data into those:

private List<Integer> vaos = new ArrayList<Integer>();
private List<Integer> vbos = new ArrayList<Integer>();

public RawModel loadToVAO(float[] positions) {
    int vaoID = createVAO();
    storeDataInAttributeList(0, positions);
    unbindVAO();
    return new RawModel(vaoID, positions.length / 3);
}

public void cleanUp() {
    for (int vao : vaos) {
        GL30.glDeleteVertexArrays(vao);
    }
    for (int vbo : vbos) {
        GL15.glDeleteBuffers(vbo);
    }
}

private int createVAO() {
    int vaoID = GL30.glGenVertexArrays();
    vaos.add(vaoID);
    GL30.glBindVertexArray(vaoID);
    return vaoID;
}

private void storeDataInAttributeList(int attributeNumber, float[] data) {
    int vboID = GL15.glGenBuffers();
    vbos.add(vboID);
    GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, vboID);
    FloatBuffer buffer = storeDataInFloatBuffer(data);
    GL15.glBufferData(GL15.GL_ARRAY_BUFFER, buffer, GL15.GL_STATIC_DRAW);
    GL20.glVertexAttribPointer(attributeNumber, 3, GL11.GL_FLOAT, false, 0, 0);
    GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, 0);
}

private void unbindVAO() {
    GL30.glBindVertexArray(0);
}

private FloatBuffer storeDataInFloatBuffer(float[] data) {
    FloatBuffer buffer = BufferUtils.createFloatBuffer(data.length);
    buffer.put(data).position(0);
    buffer.flip();
    return buffer;
}

And my main Method:

public static void main(String[] args){
    if(!glfwInit()){
        throw new IllegalStateException("Failed");
    }

    System.out.println(GL11.glGetString(GL11.GL_VERSION));

    glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE);

    GLFW.glfwWindowHint(GLFW.GLFW_CONTEXT_VERSION_MINOR, 3);
    GLFW.glfwWindowHint(GLFW.GLFW_CONTEXT_VERSION_MAJOR, 4);

    long window = GLFW.glfwCreateWindow(640, 480, "Hello World", 0, 0);

    if(window == 0){
        throw new IllegalStateException("Failed to create Window");
    }

    GLFWVidMode vidmode = glfwGetVideoMode(glfwGetPrimaryMonitor());
    glfwSetWindowPos(window, (vidmode.width() - 640) / 2, (vidmode.height() - 480) / 2);

    glfwShowWindow(window);

    Loader loader = new Loader();
    Renderer renderer = new Renderer();

    float[] vertices = {
        -0.5f, 0.5f, 0f,
        -0.5f, -0.5f, 0f,
        0.5f, -0.5f, 0f,

        0.5f, -0.5f, 0f,
        0.5f, 0.5f, 0f,
        -0.5f, 0.5f, 0f
    };

    RawModel model = loader.loadToVAO(vertices);

    while(!glfwWindowShouldClose(window)){
        renderer.prepare();
        renderer.render(model);
        glfwPollEvents();
    }

    loader.cleanUp();
    GLFW.glfwTerminate();

}

So I have already tried:

Updating my graphics card drivers, updating Java, updating Windows, setting up a new Eclipse, reinstalling Java, and deleting .metadata in Eclipse.

Can anyone please help me?


Answer:

According to the comment

I dont have implemented a shader yet

The state-of-the-art way of rendering in OpenGL is to use a shader.

If you don't use a shader, then you have to define the array of vertex data with glVertexPointer, which specifies an array for the fixed-function vertex coordinate attribute. Without a shader program, you have to use the fixed-function pipeline.

private void storeDataInAttributeList(int attributeNumber, float[] data) {
    int vboID = GL15.glGenBuffers();
    vbos.add(vboID);
    GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, vboID);
    FloatBuffer buffer = storeDataInFloatBuffer(data);
    GL15.glBufferData(GL15.GL_ARRAY_BUFFER, buffer, GL15.GL_STATIC_DRAW);

    GL11.glVertexPointer( 3, GL11.GL_FLOAT, 0, 0 ); // <---------

    GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, 0);
}

Further, you have to enable the client-side capability for vertex coordinates with glEnableClientState( GL_VERTEX_ARRAY ):

public void render(RawModel model){
    GL30.glBindVertexArray(model.getVaoID());

    GL11.glEnableClientState( GL11.GL_VERTEX_ARRAY );   // <---------

    GL11.glDrawArrays(GL11.GL_TRIANGLES, 0, model.getVertexCount());

    GL11.glDisableClientState( GL11.GL_VERTEX_ARRAY );  // <---------

    GL30.glBindVertexArray(0);
}

Further note that you have to create the GLCapabilities instance, which makes the OpenGL bindings available for use, before you call any OpenGL function such as GL30.glGenVertexArrays(), and you have to ensure that the OpenGL context is current.

Call glfwMakeContextCurrent(window) and GL.createCapabilities() after creating the window and before any OpenGL instruction:

long window = GLFW.glfwCreateWindow(640, 480, "Hello World", 0, 0);
if(window == 0){
    throw new IllegalStateException("Failed to create Window");
}

GLFWVidMode vidmode = glfwGetVideoMode(glfwGetPrimaryMonitor());
glfwSetWindowPos(window, (vidmode.width() - 640) / 2, (vidmode.height() - 480) / 2);

glfwMakeContextCurrent(window);  // <-----

glfwShowWindow(window);

GL.createCapabilities();         // <-----

.....

Finally, you are missing glfwSwapBuffers in the render loop. glfwSwapBuffers swaps the front and back buffers of the specified window; in simple words, it brings the buffer you rendered into onto the screen:

while(!glfwWindowShouldClose(window)){
    renderer.prepare();
    renderer.render(model);

    glfwSwapBuffers(window); // <-----

    glfwPollEvents();
}

See also LWJGL 3 Guide - Getting Started

Question:

I have "-XstartOnFirstThread" in my VM arguments, however I am still getting the error message:

Exception in thread "Thread-0" java.lang.ExceptionInInitializerError
    at org.lwjgl.glfw.GLFW.glfwCreateWindow(GLFW.java:1248)
    at Main.init(Main.java:33)
    at Main.run(Main.java:56)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.IllegalStateException: GLFW windows may only be created on the main thread.
    at org.lwjgl.glfw.EventLoop$OffScreen.<clinit>(EventLoop.java:39)
    ... 4 more

My Code:

import static org.lwjgl.glfw.GLFW.*;
import static org.lwjgl.opengl.GL11.*;
import static org.lwjgl.system.MemoryUtil.*;

import org.lwjgl.glfw.*;


public class Main implements Runnable {

    private Thread thread;
    private boolean running;

    public long window;

    public static void main(String[] args) {
        Main game = new Main();
        game.start();
    }

    public void start() {
        running = true;
        thread = new Thread(this);
        thread.start();
    }

    public void init() {
        if(glfwInit() != GL_TRUE) {
            System.err.println("GLFW Initialization Failed!");
        }

        glfwWindowHint(GLFW_RESIZABLE, GL_TRUE);

        window = glfwCreateWindow(800, 600, "test", NULL, NULL);

        if(window == NULL) {
            System.err.println("Could not create our window!");
        }

        GLFWVidMode vidmode = glfwGetVideoMode(glfwGetPrimaryMonitor());
        glfwSetWindowPos(window, 100, 100);

        glfwMakeContextCurrent(window);

        glfwShowWindow(window);
    }

    public void update() {
        glfwPollEvents();
    }

    public void render() {
        glfwSwapBuffers(window);
    }

    public void run() {
        init();
        while(running) {
            update();
            render();

            if(glfwWindowShouldClose(window) == GL_TRUE) {
                running = false;
            }
        }
    }

}

I am asking this question here because I have looked around and have not seen a solution posted for this anywhere else. Thank you for your help!


Answer:

Right now you are starting your program in the main thread, but immediately creating a new thread that creates the window. In LWJGL, you should do all GLFW calls and OpenGL rendering in the main thread. You can use other threads to build VBOs, load textures, calculate physics, etc.
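
Applied to the code in the question, the minimal fix is to call run() directly instead of spawning a thread. A stripped-down sketch of that restructuring (the GLFW calls are reduced to comments so the skeleton stands alone; the runThreadId field is only there to illustrate which thread the loop ends up on):

```java
public class GameMain implements Runnable {
    static long runThreadId; // records which thread executed run(), for illustration only

    public static void main(String[] args) {
        GameMain game = new GameMain();
        game.run();          // direct call: the loop stays on the main thread
                             // (instead of new Thread(game).start())
    }

    @Override
    public void run() {
        runThreadId = Thread.currentThread().getId();
        // init();           // glfwInit(), glfwCreateWindow(), glfwMakeContextCurrent(), ...
        // while (running) { // update() -> glfwPollEvents(), render() -> glfwSwapBuffers()
        //     if (glfwWindowShouldClose(window) == GL_TRUE) running = false;
        // }
    }
}
```

Worker threads are still fine for CPU-side jobs (building buffer contents, decoding textures), as long as the GLFW and OpenGL calls themselves stay on the thread that owns the context.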

Question:

I have a problem with my little game. I'm new to LWJGL 3 (I used LWJGL 2 for a long time) and want to implement a key that switches between windowed and fullscreen mode. I can already do this without big problems, but the game crashes if I press this key often enough, sometimes sooner, sometimes later; there is no fixed timing. I have tried a lot (3 hours) and have now run out of ideas. Does anybody know what I am doing wrong? I can't post the complete source code because it's too much, so I post some important parts below. First, here is the error:

Exception in thread "main" org.lwjgl.system.libffi.ClosureError: Callback failed because the closure instance has been garbage collected.
at org.lwjgl.system.JNI.invokeIIPPPP(Native Method)
at org.lwjgl.glfw.GLFW.nglfwCreateWindow(GLFW.java:1146)
at org.lwjgl.glfw.GLFW.glfwCreateWindow(GLFW.java:1227)
at com.dungeon.gl.FullscreenCreation.setFullScreen(FullscreenCreation.java:63)
at com.dungeon.gl.GLAction.updateGL(GLAction.java:78)
at com.dungeon.MainAction.start(MainAction.java:69)
at com.dungeon.MainAction.main(MainAction.java:21)

Here is the code I use to check if the key is down:

    if(glfwGetKey(appID, GLFW_KEY_F) == GLFW_PRESS) {
        FullscreenCreation.setFullScreen(!fullscreen);
    } 

Here is my FullscreenCreation.java

    public static void setFullScreen(boolean fullScreen) {

    //i don't have copy the variables like fullscreen to here
    if (fullscreen == fullScreen)
        return;
    fullscreen = fullScreen; 

    if (fullScreen) {
        windowWIDTH = WIDTH;
        windowHEIGHT = HEIGHT;

        //get monitor resolution
        ByteBuffer vidMode = glfwGetVideoMode(glfwGetPrimaryMonitor());
        WIDTH = GLFWvidmode.width(vidMode);
        HEIGHT = GLFWvidmode.height(vidMode);
    } else {
        WIDTH = windowWIDTH;
        HEIGHT = windowHEIGHT;
    }       

    //create new window THIS IS THE POINT WHERE ITS CRASH AFTER A WHILE
    long display = glfwCreateWindow(WIDTH, HEIGHT, MainAction.BASIC_WINDOW_TITLE, fullScreen ? glfwGetPrimaryMonitor() : 0, appID);
    glfwDestroyWindow(appID); //kill the window (for now)

    //add new callback for resizing the window
    glfwSetCallback(display, GLFWWindowSizeCallback(new SAM() {
        @Override
        public void invoke(long window, int width, int height) {
            if(width!=0 && height!=0) {
            WIDTH=width;
            HEIGHT=height;
            isResize=true;
            }
        }
    }));


    //now apply the new fullscreen window
    appID = display;

    //some GL stuff for the new window
    glfwMakeContextCurrent(appID);        
    GL.createCapabilities();

    glfwSwapInterval(0); //disable vsync

    //only call glEnable(GL_TEXTURE_2D), GL_BLEND ...
    DisplayCreation.enableGLStuff();

    isResize = true;
}

If you need more code I will post more.


Answer:

You need to keep a strong reference on every callback you create, because the garbage collector doesn't know it is used in native code and thus collects it otherwise.

So this means you have to store the callback object in a field.

private static GLFWWindowSizeCallback sizeCallback;

//...

public static void setFullScreen(boolean fullScreen) {
    //...
    sizeCallback = GLFWWindowSizeCallback(new SAM() {
        //...
    });
    glfwSetCallback(display, sizeCallback);
    //...
}

Then, when you are done using the callback (i.e. when you destroy the window), you just have to release it.

sizeCallback.release();

Note: To release all callbacks of a given window, you can also use Callbacks.glfwReleaseCallbacks.


For differences between LWJGL 2 and 3 you can have a look at the migration guide, which also contains a brief explanation of this problem.

Question:

I tried to build the example mod which comes with the 1.12.2 MDK. gradlew setupDecompWorkspace, gradlew eclipse and even gradlew build work fine, but when I run gradlew runClient it crashes immediately and prints the following error/stack trace:

[21:02:20] [main/ERROR] [LaunchWrapper]: Unable to launch
java.lang.reflect.InvocationTargetException: null
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_242]
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_242]
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_242]
        at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_242]
        at net.minecraft.launchwrapper.Launch.launch(Launch.java:135) [launchwrapper-1.12.jar:?]
        at net.minecraft.launchwrapper.Launch.main(Launch.java:28) [launchwrapper-1.12.jar:?]
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_242]
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_242]
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_242]
        at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_242]
        at net.minecraftforge.gradle.GradleStartCommon.launch(GradleStartCommon.java:97) [start/:?]
        at GradleStart.main(GradleStart.java:25) [start/:?]
Caused by: java.lang.ExceptionInInitializerError
        at org.lwjgl.LinuxSysImplementation.<clinit>(LinuxSysImplementation.java:50) ~[lwjgl-2.9.4-nightly-20150209.jar:?]
        at org.lwjgl.Sys.createImplementation(Sys.java:131) ~[lwjgl-2.9.4-nightly-20150209.jar:?]
        at org.lwjgl.Sys.<clinit>(Sys.java:116) ~[lwjgl-2.9.4-nightly-20150209.jar:?]
        at net.minecraft.client.Minecraft.getSystemTime(Minecraft.java:3159) ~[Minecraft.class:?]
        at net.minecraft.client.main.Main.main(Main.java:42) ~[Main.class:?]
        ... 12 more
Exception in thread "main" Caused by: java.lang.NullPointerException
        at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1847) ~[?:1.8.0_242]
        at java.lang.Runtime.loadLibrary0(Runtime.java:871) ~[?:1.8.0_242]
        at java.lang.System.loadLibrary(System.java:1124) ~[?:1.8.0_242]
        at java.awt.Toolkit$3.run(Toolkit.java:1636) ~[?:1.8.0_242]
        at java.awt.Toolkit$3.run(Toolkit.java:1634) ~[?:1.8.0_242]
        at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_242]
        at java.awt.Toolkit.loadLibraries(Toolkit.java:1633) ~[?:1.8.0_242]
        at java.awt.Toolkit.<clinit>(Toolkit.java:1670) ~[?:1.8.0_242]
        at org.lwjgl.LinuxSysImplementation.<clinit>(LinuxSysImplementation.java:50) ~[lwjgl-2.9.4-nightly-20150209.jar:?]
        at org.lwjgl.Sys.createImplementation(Sys.java:131) ~[lwjgl-2.9.4-nightly-20150209.jar:?]
        at org.lwjgl.Sys.<clinit>(Sys.java:116) ~[lwjgl-2.9.4-nightly-20150209.jar:?]
        at net.minecraft.client.Minecraft.getSystemTime(Minecraft.java:3159) ~[Minecraft.class:?]
        at net.minecraft.client.main.Main.main(Main.java:42) ~[Main.class:?]
        ... 12 more
[21:02:20] [main/INFO] [STDERR]: [java.lang.ThreadGroup:uncaughtException:1052]: java.lang.reflect.InvocationTargetException
[21:02:20] [main/INFO] [STDERR]: [java.lang.ThreadGroup:uncaughtException:1052]:        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[21:02:20] [main/INFO] [STDERR]: [java.lang.ThreadGroup:uncaughtException:1052]:        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[21:02:20] [main/INFO] [STDERR]: [java.lang.ThreadGroup:uncaughtException:1052]:        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[21:02:20] [main/INFO] [STDERR]: [java.lang.ThreadGroup:uncaughtException:1052]:        at java.lang.reflect.Method.invoke(Method.java:498)
[21:02:20] [main/INFO] [STDERR]: [java.lang.ThreadGroup:uncaughtException:1052]:        at net.minecraftforge.gradle.GradleStartCommon.launch(GradleStartCommon.java:97)
[21:02:20] [main/INFO] [STDERR]: [java.lang.ThreadGroup:uncaughtException:1052]:        at GradleStart.main(GradleStart.java:25)
[21:02:20] [main/INFO] [STDERR]: [java.lang.ThreadGroup:uncaughtException:1061]: Caused by: net.minecraftforge.fml.relauncher.FMLSecurityManager$ExitTrappedException
[21:02:20] [main/INFO] [STDERR]: [java.lang.ThreadGroup:uncaughtException:1061]:        at net.minecraftforge.fml.relauncher.FMLSecurityManager.checkPermission(FMLSecurityManager.java:49)
[21:02:20] [main/INFO] [STDERR]: [java.lang.ThreadGroup:uncaughtException:1061]:        at java.lang.SecurityManager.checkExit(SecurityManager.java:761)
[21:02:20] [main/INFO] [STDERR]: [java.lang.ThreadGroup:uncaughtException:1061]:        at java.lang.Runtime.exit(Runtime.java:108)
[21:02:20] [main/INFO] [STDERR]: [java.lang.ThreadGroup:uncaughtException:1061]:        at java.lang.System.exit(System.java:973)
[21:02:20] [main/INFO] [STDERR]: [java.lang.ThreadGroup:uncaughtException:1061]:        at net.minecraft.launchwrapper.Launch.launch(Launch.java:138)
[21:02:20] [main/INFO] [STDERR]: [java.lang.ThreadGroup:uncaughtException:1061]:        at net.minecraft.launchwrapper.Launch.main(Launch.java:28)
[21:02:20] [main/INFO] [STDERR]: [java.lang.ThreadGroup:uncaughtException:1061]:        ... 6 more

System Information:

  • OS: Ubuntu 18.04
  • Forge MDK: forge-1.12.2-14.23.5.2847-mdk
  • Graphics Driver: nvidia-driver-435
  • Java Version: 1.8.0_242

However I can still run (modded) Minecraft from the launcher without having this error.


Answer:

I had this same issue. This question seemed to be the same problem, so I tried the fix it gave: downgrade from Java 1.8.0_242 to 1.8.0_232. It worked, although it was difficult to find the right Java version (it was on adoptopenjdk.net). Try running Gradle with Java 1.8.0_232.

Question:

I started learning how to use the LWJGL library and wrote a simple display program, but it crashes right away. I'm using Linux Mint.

Exception in thread "main" java.lang.ExceptionInInitializerError
    at DisplayExample.start(DisplayExample.java:10)
    at DisplayExample.main(DisplayExample.java:29)
Caused by: java.lang.ArrayIndexOutOfBoundsException: Index 0 out of bounds for length 0
    at org.lwjgl.opengl.LinuxDisplay.getAvailableDisplayModes(LinuxDisplay.java:951)
    at org.lwjgl.opengl.LinuxDisplay.init(LinuxDisplay.java:738)
    at org.lwjgl.opengl.Display.<clinit>(Display.java:138)
    ... 2 more

Process finished with exit code 1

Here's the code I wrote:

import org.lwjgl.LWJGLException;
import org.lwjgl.opengl.Display;
import org.lwjgl.opengl.DisplayMode;

import java.io.File;

public class DisplayExample {
    public void start(){
        try{
            Display.setDisplayMode(new DisplayMode(800,600));
            Display.create();
        }
        catch(LWJGLException e){
            e.printStackTrace();
            System.exit(0);
        }

        while(!Display.isCloseRequested()){
            Display.update();
        }
        Display.destroy();
    }

    public static void main(String[] args){
        System.setProperty("org.lwjgl.librarypath", System.getProperty("user.dir") + File.separator + "linux");
        DisplayExample d = new DisplayExample();
        d.start();
    }
}

thanks in advance for replies!


Here's the output of xrandr -q

Screen 0: minimum 320 x 200, current 1920 x 1080, maximum 8192 x 8192
eDP-1 connected primary 1920x1080+0+0 (normal left inverted right x axis y axis) 345mm x 194mm
   1920x1080     60.02*+  60.01    59.97    59.96    59.93  
   1680x1050     59.95    59.88  
   1600x1024     60.17  
   1400x1050     59.98  
   1600x900      59.99    59.94    59.95    59.82  
   1280x1024     60.02  
   1440x900      59.89  
   1400x900      59.96    59.88  
   1280x960      60.00  
   1440x810      60.00    59.97  
   1368x768      59.88    59.85  
   1360x768      59.80    59.96  
   1280x800      59.99    59.97    59.81    59.91  
   1152x864      60.00  
   1280x720      60.00    59.99    59.86    59.74  
   1024x768      60.04    60.00  
   960x720       60.00  
   928x696       60.05  
   896x672       60.01  
   1024x576      59.95    59.96    59.90    59.82  
   960x600       59.93    60.00  
   960x540       59.96    59.99    59.63    59.82  
   800x600       60.00    60.32    56.25  
   840x525       60.01    59.88  
   864x486       59.92    59.57  
   800x512       60.17  
   700x525       59.98  
   800x450       59.95    59.82  
   640x512       60.02  
   720x450       59.89  
   700x450       59.96    59.88  
   640x480       60.00    59.94  
   720x405       59.51    58.99  
   684x384       59.88    59.85  
   680x384       59.80    59.96  
   640x400       59.88    59.98  
   576x432       60.06  
   640x360       59.86    59.83    59.84    59.32  
   512x384       60.00  
   512x288       60.00    59.92  
   480x270       59.63    59.82  
   400x300       60.32    56.34  
   432x243       59.92    59.57  
   320x240       60.05  
   360x202       59.51    59.13  
   320x180       59.84    59.32  
HDMI-1 disconnected (normal left inverted right x axis y axis)
DP-1 disconnected (normal left inverted right x axis y axis)

Answer:

LWJGL uses xrandr under the hood: it gets the available screen resolutions by parsing the output of the xrandr -q command.

Check that xrandr is installed and that xrandr -q returns valid output on the machine that runs your Java code. Then check your logs for an "Exception in XRandR.populate()" message, which will tell you why LWJGL failed to get the resolutions.
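
A quick way to verify both conditions from a shell (a sketch; the package name in the hint is an assumption for Debian/Ubuntu):

```shell
# Report whether xrandr is usable on this machine; every branch prints a diagnostic.
check_xrandr() {
  if ! command -v xrandr >/dev/null 2>&1; then
    echo "xrandr missing: install it (on Debian/Ubuntu: x11-xserver-utils)"
  elif ! xrandr -q >/dev/null 2>&1; then
    echo "xrandr found but 'xrandr -q' failed: check DISPLAY / the X session"
  else
    echo "xrandr OK:"
    xrandr -q | head -n 3
  fi
}
check_xrandr
```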

Question:

Whenever I run the following code, the JVM crashes with an EXCEPTION_ACCESS_VIOLATION when freeing the IntBuffer:

public int[] size(){
    IntBuffer size=BufferUtils.createIntBuffer(2);
    long address=MemoryUtil.memAddress(size);

    GLFW.nglfwGetWindowSize(this.handle, address, address+Integer.BYTES);
    int[] result=new int[]{size.get(0), size.get(1)};

    JEmalloc.nje_free(address);

    return result;
}

Pastebin


Answer:

The ByteBuffer created by BufferUtils is managed by the garbage collector and will be released when there are no more references to it. Passing its address to JEmalloc.nje_free() hands memory that jemalloc never allocated to jemalloc's free, which is what triggers the access violation.

If you want to use JEmalloc.nje_free() to free the buffer, then you also need to allocate the buffer with JEmalloc.

IntBuffer size = JEmalloc.je_malloc(2 * Integer.BYTES).asIntBuffer();
long address = MemoryUtil.memAddress(size);

[...]

JEmalloc.nje_free(address);

Question:

I changed my code from display lists to VBOs/VAOs, but when I run the application it crashes at the first attempt to draw the VAOs. I'm drawing without shaders (so they cannot be the cause).

There are up to 12 faces (12 * 3 vertices) in one VAO, plus texture coordinates for them, and up to 500,000 VAOs.

How I create a face:

tData.add(new float[]{textureX + 0.1249f, textureY+ 0.1249f});
vData.add(new float[]{x, y, z});
tData.add(new float[]{textureX+ 0.1249f, textureY+0.0001f});
vData.add(new float[]{x, y+1, z+1});
tData.add(new float[]{textureX+0.0001f, textureY+0.0001f});
vData.add(new float[]{x+1, y+1, z+1});

Creating the VBO/VAO:

if(vData.isEmpty())
        return;

    int vaoHandle = glGenVertexArrays();
    glBindVertexArray(vaoHandle);

    int vertexDataSize = vData.size() * 3;
    int textureDataSize = tData.size() * 2;
    FloatBuffer vertexData = BufferUtils.createFloatBuffer(vData.size() * 3);
    FloatBuffer textureData = BufferUtils.createFloatBuffer(tData.size() * 2);

    while(!vData.isEmpty())
    {
        vertexData.put(vData.remove(0));
    }
    while(!tData.isEmpty())
    {
        textureData.put(tData.remove(0));
    }

    vertexData.flip();
    textureData.flip();

    int vertexHandle = glGenBuffers();
    glBindBuffer(GL_ARRAY_BUFFER, vertexHandle);
    glBufferData(GL_ARRAY_BUFFER, vertexData, GL_STATIC_DRAW);
    glVertexAttribPointer(0, vertexDataSize, GL_FLOAT, false, 0, 0);

    int textureHandle = glGenBuffers();
    glBindBuffer(GL_ARRAY_BUFFER, textureHandle);
    glBufferData(GL_ARRAY_BUFFER, textureData, GL_STATIC_DRAW);
    glVertexAttribPointer(1, textureDataSize, GL_FLOAT, false, 0, 0);


    glBindBuffer(GL_ARRAY_BUFFER, 0);
    glBindVertexArray(0);

    renderEngine.vaos.add(new VAO(vaoHandle, (int)((float)vertexDataSize / 3f)));

Rendering the VBO / VAO:

    glUseProgram(0);//TODO:REMOVE
    for(VAO vao : vaos)
    {
        glBindVertexArray(vao.getHandle());
        glEnableVertexAttribArray(0);
        glEnableVertexAttribArray(1);

        System.out.println(vao.getVertices());//correct numeber
        glDrawArrays(GL_TRIANGLES, 0, vao.getVertices());//<-- CRASH at first time called

        glDisableVertexAttribArray(1);
        glDisableVertexAttribArray(0);
    }
    glBindVertexArray(0);

Here's the error:

# A fatal error has been detected by the Java Runtime Environment:
#
#  EXCEPTION_ACCESS_VIOLATION (0xc0000005) at pc=0x000000000b3fb610, pid=7736, tid=6224
#
# JRE version: Java(TM) SE Runtime Environment (8.0_20-b26) (build 1.8.0_20-b26)
# Java VM: Java HotSpot(TM) 64-Bit Server VM (25.20-b23 mixed mode windows-amd64 compressed oops)
# Problematic frame:
# C  [ig75icd64.dll+0x8b610]
#
# Failed to write core dump. Minidumps are not enabled by default on client versions of Windows
#
# An error report file with more information is saved as:
# C:\Users\Geosearchef\workspaces\workspaceLWJGL\OrangeJuiceVBO\hs_err_pid7736.log
#
# If you would like to submit a bug report, please visit:
#   http://bugreport.sun.com/bugreport/crash.jsp
# The crash happened outside the Java Virtual Machine in native code.
# See problematic frame for where to report the bug.
#

I don't think it makes sense to post the whole error log here. Do you have any idea why this is happening? I can't find anything about this error in connection with VBOs.


Answer:

You do not set the vertex attribute pointers correctly:

int vertexDataSize = vData.size() * 3;
int textureDataSize = tData.size() * 2;
[...]
glVertexAttribPointer(0, vertexDataSize, GL_FLOAT, false, 0, 0);
[...]
glVertexAttribPointer(1, textureDataSize, GL_FLOAT, false, 0, 0);

The size parameter defines the number of elements in the vector of each vertex, and must be in the range of 1 to 4.

Your code will just generate a GL error (you should definitely add some error checks, at least for debugging) and leave the attribute pointer uninitialized.

Another issue here: you use generic vertex attributes 0 and 1, but you don't use shaders. That is not going to work. The spec only guarantees that attribute index 0 will map to the classic glVertex attribute, but attribute 1 might be anything, or not work at all.
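
To make the size-parameter rule concrete without needing a GL context, here is a small pure-Java sketch (hypothetical helper names): the size passed to glVertexAttribPointer is the per-vertex component count (3 for these positions, 2 for the texture coordinates), while the count fed to glDrawArrays comes from dividing the buffer length by that size.

```java
public class AttribSize {
    // glVertexAttribPointer's size parameter: components per vertex, must be 1..4
    static int attribSize(int componentsPerVertex) {
        if (componentsPerVertex < 1 || componentsPerVertex > 4)
            throw new IllegalArgumentException(
                "glVertexAttribPointer size must be 1..4, got " + componentsPerVertex);
        return componentsPerVertex;
    }

    // number of vertices stored in a flat float array (e.g. positions: 3 floats each)
    static int vertexCount(int totalFloats, int componentsPerVertex) {
        return totalFloats / attribSize(componentsPerVertex);
    }
}
```

With 12 faces, vData.size() is 36 and the position buffer holds 108 floats; 36 is a valid vertex count for glDrawArrays, but passing the float total as the size argument (as the code above does) is out of range.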

Question:

My program is crashing due to an EXCEPTION_ACCESS_VIOLATION in the file ig7icd64.dll.

I'm creating a simple project with the LWJGL library and some external jars found here: https://github.com/CodingAP/LWJGL-3-Tutorial.git (I only use slick-util3). When loading textures from resource files (.png), the program crashes.

I have many classes which are using a lot of GLXX.gl[function_name_here], but I will explain why I do not list all of them.

I create a window with a Window class which just sets up a GLFW-context and works just fine.

I have a Model class which gets extended by UntexturedModel as well as TexturedModel. These set up VertexArrays and VertexBuffers and function normally. I even have a Shader class which reads two shader files and applies them with no errors.

public class UntexturedModel extends Model {

private int vertexArrayID, vertexBufferID, indicesBufferID, vertexCount;

public UntexturedModel(float[] vertices, int[] indices) {
    vertexArrayID = super.createVertexArray();
    indicesBufferID = super.bindIndicesBuffer(indices);
    vertexBufferID = super.storeData(0, 3, vertices);
    vertexCount = indices.length;
    GL30.glBindVertexArray(0);
}

public void destroy () {
    GL30.glDeleteVertexArrays(vertexArrayID);
    GL15.glDeleteBuffers(vertexBufferID);
    GL15.glDeleteBuffers(indicesBufferID);
}

// I have not included the getters for the IDs due to the space available
}

This extends the Model class which follows:

public class Model {

protected int createVertexArray() {
    int vertexArrayID = GL30.glGenVertexArrays();
    GL30.glBindVertexArray(vertexArrayID);
    return vertexArrayID;
}

protected int storeData (int attributeNumber, int coordSize, float[] data) {

    FloatBuffer buffer = BufferUtils.createFloatBuffer(data.length);
    buffer.put(data);
    buffer.flip();

    int bufferID = GL15.glGenBuffers();
    GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, bufferID);
    GL15.glBufferData(GL15.GL_ARRAY_BUFFER, buffer, GL15.GL_STATIC_DRAW);

    GL20.glVertexAttribPointer(0, 3, GL11.GL_FLOAT, false, 0, 0);

    GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, 0);

    return bufferID;
}

protected int bindIndicesBuffer (int[] indices) {

    IntBuffer buffer = BufferUtils.createIntBuffer(indices.length);
    buffer.put(indices);
    buffer.flip();

    int bufferID = GL15.glGenBuffers();
    GL15.glBindBuffer(GL15.GL_ELEMENT_ARRAY_BUFFER, bufferID);
    GL15.glBufferData(GL15.GL_ELEMENT_ARRAY_BUFFER, buffer, GL15.GL_STATIC_DRAW);
    GL15.glBindBuffer(GL15.GL_ELEMENT_ARRAY_BUFFER, 0);

    return bufferID;
}
}

The TexturedModel is in essence the same as UntexturedModel, but has an additional float[] textureCoords which gets added using storeData(1, 2, textureCoords). It also has a Material attribute which uses the external jars:

public class Material {

private int textureID;

public Material (String file) {
    try {

        textureID = TextureLoader.getTexture("png", new FileInputStream("res/" + file)).getTextureID();
        // Possible error location: TextureLoader is an external jar

    } catch (IOException e) {
        System.err.println("Error: Couldn't load texture");
        System.exit(-1);
    }
}

public void destroy () {
    GL11.glDeleteTextures(textureID);
}
// I'm ignoring getters once again
}

Using the UntexturedModel class works fine, even with shader files. I will include the BasicShader and Shader classes below:

public abstract class Shader {

private int vertexShaderID, fragmentShaderID, programID;
private String vertexFile, fragmentFile;

public Shader (String vertexFile, String fragmentFile) {
    this.vertexFile = vertexFile;
    this.fragmentFile = fragmentFile;
}

public void create () {
    programID = GL20.glCreateProgram();

    vertexShaderID = GL20.glCreateShader(GL20.GL_VERTEX_SHADER);

    GL20.glShaderSource(vertexShaderID, readFile(vertexFile));
    GL20.glCompileShader(vertexShaderID);

    if (GL20.glGetShaderi(vertexShaderID, GL20.GL_COMPILE_STATUS) == GL11.GL_FALSE) {
        System.err.println("Error: Vertex Shader - " + GL20.glGetShaderInfoLog(vertexShaderID));
    }

    fragmentShaderID = GL20.glCreateShader(GL20.GL_FRAGMENT_SHADER);

    GL20.glShaderSource(fragmentShaderID, readFile(fragmentFile));
    GL20.glCompileShader(fragmentShaderID);

    if (GL20.glGetShaderi(fragmentShaderID, GL20.GL_COMPILE_STATUS) == GL11.GL_FALSE) {
        System.err.println("Error: Fragment Shader - " + GL20.glGetShaderInfoLog(fragmentShaderID));
    }

    GL20.glAttachShader(programID, vertexShaderID);
    GL20.glAttachShader(programID, fragmentShaderID);

    GL20.glLinkProgram(programID);

    if (GL20.glGetProgrami(programID, GL20.GL_LINK_STATUS) == GL11.GL_FALSE) {
        System.err.println("Error: Program Linking - " + GL20.glGetShaderInfoLog(programID));
    }

    GL20.glValidateProgram(programID);

    if (GL20.glGetProgrami(programID, GL20.GL_VALIDATE_STATUS) == GL11.GL_FALSE) {
        System.err.println("Error: Program Validation - " + GL20.glGetShaderInfoLog(programID));
    }
}

public abstract void bindAllAttributes();

public void bindAttribute (int index, String location) {
    GL20.glBindAttribLocation(programID, index, location);
}

public void bind () {
    GL20.glUseProgram(programID);
}

public void destroy () {
    GL20.glDetachShader(programID, vertexShaderID);
    GL20.glDetachShader(programID, fragmentShaderID);

    GL20.glDeleteShader(vertexShaderID);
    GL20.glDeleteShader(fragmentShaderID);

    GL20.glDeleteProgram(programID);
}

private String readFile (String path) {
    BufferedReader reader;
    StringBuilder builder = new StringBuilder();

    try {

        reader = new BufferedReader(new FileReader(path));

        String line = reader.readLine();

        while (line != null) {

            builder.append(line + '\n');
            line = reader.readLine();
        }

    } catch (IOException e) {
        System.err.println("Error: Exception while reading from file");
    }

    return builder.toString();
}
}

public class BasicShader extends Shader {

private static final String VERTEX_FILE = ".\\src\\shaders\\basicVertexShader.vs";
private static final String FRAGMENT_FILE = ".\\src\\shaders\\basicFragmentShader.fs";

public BasicShader() {
    super(VERTEX_FILE, FRAGMENT_FILE);
}

@Override
public void bindAllAttributes () {
    super.bindAttribute(0, "position");
    super.bindAttribute(1, "textCoords");
}
}

I've tested the shader files (basicVertexShader.vs and basicFragmentShader.fs) and they work as intended.

I have tried the following:

  • Checking whether the .dll file was deleted
  • Re-installing Java (JDK and JRE included)
  • Re-installing Eclipse
  • Updating the graphics driver to the version suggested by Intel

I'm using Windows 10 and a Lenovo Thinkpad.

If any additional information is needed, please ask below.

Update:

Stack: [0x0000000002b10000,0x0000000002c10000], sp=0x0000000002c0bf40, free space=1007k

Native frames: (J=compiled Java code, j=interpreted, Vv=VM code, C=native code)

C [ig7icd64.dll+0x933b0]

C [ig7icd64.dll+0x17b4b2]

C [ig7icd64.dll+0x215514]

C [ig7icd64.dll+0x6d1ee]

C [ig7icd64.dll+0x243745]

C [ig7icd64.dll+0x92555]

C [ig7icd64.dll+0x2a3af8]

C [ig7icd64.dll+0x2a3e09]

C [ig7icd64.dll+0x2a57ba]

C 0x0000000002d88c67

Java frames: (J=compiled Java code, j=interpreted, Vv=VM code)

j org.lwjgl.opengl.GL11C.nglDrawElements(IIIJ)V+0

j org.lwjgl.opengl.GL11C.glDrawElements(IIIJ)V+4

j org.lwjgl.opengl.GL11.glDrawElements(IIIJ)V+4

j render.Renderer.renderTexturedModel(Lrender/TexturedModel;)V+43 (Renderer is a class which just executes the glDrawElements() function; nothing special)

j main.Main.main([Ljava/lang/String;)V+191

v ~StubRoutines::call_stub

Update: I solved the issue. It lies in Model.storeData(). I don't use attributeNumber or coordSize. CLOSED


Answer:

[CLOSED] As you can see in my last edit, the error lies in the Model class. I never actually use the index and size parameters in the storeData() function: every call hardcodes glVertexAttribPointer(0, 3, ...), so the position and texture-coordinate arrays end up bound to the same attribute, and the program crashed. Thank you to everyone.

Question:

When I render to an FBO and blit it to the window while the window is minimized, the JVM crashes with an EXCEPTION_ACCESS_VIOLATION.

This is my code for blitting to the screen and to my understanding, what it does.

//Bind the draw framebuffer to the default (0) 
GL30.glBindFramebuffer(GL30.GL_DRAW_FRAMEBUFFER, 0); 
//Bind the read framebuffer to the fbo id
GL30.glBindFramebuffer(GL30.GL_READ_FRAMEBUFFER, frameBufferID); 
//Setting the draw buffer to the screen
GL11.glDrawBuffer(GL11.GL_BACK); 
//Setting the read buffer to the color attachment of the fbo
GL11.glReadBuffer(GL30.GL_COLOR_ATTACHMENT0);
if(GL30.glCheckFramebufferStatus(GL30.GL_FRAMEBUFFER) == GL30.GL_FRAMEBUFFER_COMPLETE) //Checking if the framebuffer is complete
{ 
  //Blitting the frambuffer to the screen
  GL30.glBlitFramebuffer(0, 0, fboWidth, fboHeight, 0, 0, windowWidth, windowHeight, GL11.GL_COLOR_BUFFER_BIT, GL11.GL_NEAREST); 
}
//Unbinding the framebuffer
GL30.glBindFramebuffer(GL30.GL_FRAMEBUFFER, 0); 

To prevent the crash I added a check for whether the window is minimized:

if(isWindowIconified) return;
//Bind the draw framebuffer to the default (0) 
GL30.glBindFramebuffer(GL30.GL_DRAW_FRAMEBUFFER, 0); 
//Bind the read framebuffer to the fbo
GL30.glBindFramebuffer(GL30.GL_READ_FRAMEBUFFER, frameBufferID); 
//Setting the draw buffer to the screen
GL11.glDrawBuffer(GL11.GL_BACK);
//Setting the read buffer to the color attachment of the fbo
GL11.glReadBuffer(GL30.GL_COLOR_ATTACHMENT0); 
if(GL30.glCheckFramebufferStatus(GL30.GL_FRAMEBUFFER) == GL30.GL_FRAMEBUFFER_COMPLETE)
{ 
  //Blitting the frambuffer to the screen
  GL30.glBlitFramebuffer(0, 0, fboWidth, fboHeight, 0, 0, windowWidth, windowHeight, GL11.GL_COLOR_BUFFER_BIT, GL11.GL_NEAREST); 
}
//Unbinding the framebuffer
GL30.glBindFramebuffer(GL30.GL_FRAMEBUFFER, 0);

I also added this bit of code when initializing the window

//Setting the window minimization callback
glfwSetWindowIconifyCallback(window, new GLFWWindowIconifyCallbackI() { 
  @Override
  public void invoke(long window, boolean iconified) {
    isWindowIconified = iconified;
  }
});

Now the program doesn't crash when I minimize it, but when I press Windows+D to get to the desktop the program still crashes.

Now to my questions: What is the best way to prevent the crash? Why does this happen?

System Information:

OS: Windows 10 Home, Version 10.0.15063

GPU: Intel HD Graphics 520

Driver Version: 20.19.15.4642

OpenGL Version: 4.4


Answer:

If the JVM crashes with an EXCEPTION_ACCESS_VIOLATION and it happened in ig9icd64.dll, it is probably because you are using old Intel GPU drivers. That seems to be the problem in this case, because the crash doesn't happen when I launch the program with my NVIDIA GPU, and it stopped crashing after I updated my Intel GPU driver.

If you are not sure where the error happened, look for "Native frames:" and "Java frames:" in the crash report. The topmost line shows where the error happened.
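
To pull those sections out of a crash log quickly, here is a small shell sketch (the log file name in the usage comment is just an example):

```shell
# Print the frame sections of a HotSpot crash log: the topmost C/j lines
# under "Native frames:" / "Java frames:" point at the faulting code.
show_frames() {
  grep -A 10 -E '^(Native frames|Java frames)' "$1"
}
# usage: show_frames hs_err_pid7736.log
```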

Question:

Lately whenever I run my OpenGL game I get this strange error:

I am on OS X.

What does this error mean?

#
# A fatal error has been detected by the Java Runtime Environment:
#
#  SIGSEGV (0xb) at pc=0x00000001276eff9f, pid=461, tid=5379
#
# JRE version: Java(TM) SE Runtime Environment (8.0_11-b12) (build 1.8.0_11-b12)
# Java VM: Java HotSpot(TM) 64-Bit Server VM (25.11-b03 mixed mode bsd-amd64 compressed oops)
# Problematic frame:
# C  [GLEngine+0x15bf9f]  gleRunVertexSubmitImmediate+0x27ef
#
# Failed to write core dump. Core dumps have been disabled. To enable core dumping, try "ulimit -c unlimited" before starting Java again
#
# An error report file with more information is saved as:
# /Users/amit/workspace/Evox Voxel Engine GL 2.1/hs_err_pid461.log
#
# If you would like to submit a bug report, please visit:
#   http://bugreport.sun.com/bugreport/crash.jsp
# The crash happened outside the Java Virtual Machine in native code.
# See problematic frame for where to report the bug.
#

This error is generated whenever I draw:

    GL11.glDrawElements(GL11.GL_TRIANGLES, indicesCount, GL11.GL_UNSIGNED_INT, 0);

Answer:

This was my problem:

I was using a single global indices VBO for all chunks, because I figured it wouldn't matter if I drew a few more indices than I actually had. It turns out this makes the driver read past the end of the vertex data, an invalid memory access that crashes the entire game.

So the lesson is: always calculate EXACTLY the number of indices each object has and supply that to the draw call; anything else invites undefined behaviour.

Another possible cause: the index type you chose (byte/short/int) limits how high an index can be, and your game can also crash if an index exceeds what that type can hold.

Happy programming.
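
The second point can be made concrete with a small helper (hypothetical; the GL constants are inlined so the sketch needs no GL binding) that picks the narrowest index type whose range covers a mesh's largest index:

```java
public class IndexTypePicker {
    static final int GL_UNSIGNED_BYTE  = 0x1401;
    static final int GL_UNSIGNED_SHORT = 0x1403;
    static final int GL_UNSIGNED_INT   = 0x1405;

    // narrowest GL index type whose unsigned range contains maxIndex
    static int indexTypeFor(int maxIndex) {
        if (maxIndex <= 0xFF)   return GL_UNSIGNED_BYTE;
        if (maxIndex <= 0xFFFF) return GL_UNSIGNED_SHORT;
        return GL_UNSIGNED_INT;
    }
}
```

For example, a chunk whose indices go up to 70,000 must be drawn with GL_UNSIGNED_INT; GL_UNSIGNED_SHORT tops out at 65,535.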