Hot questions for using the Lightweight Java Game Library (LWJGL) on macOS

Question:

I was developing a game in Java and OpenGL on Windows, and yesterday I got a new MacBook Pro. When I moved my code from the Windows machine to the OS X machine I got an error saying the shaders couldn't compile. I read on the internet that this is because on OS X you have to request the core profile, which is only supported from OpenGL 3.2 onwards. Here is my problem: for my GUI and my text I use GL10 functions and the Slick2D library. Whenever I set my OpenGL version to 3.2 with the core profile, the game itself runs fine but the Slick2D code throws an error.

  • I wanted to know whether there is a way to run OpenGL with the core profile but a version other than 3.2.
  • Is there a text-rendering library like Slick2D that supports OpenGL 3.2, or will I have to build my own texture-atlas handling for text?

Answer:

I do not have enough reputation to comment, so I must write this as an answer.

On OS X, the core profile is part of the context configuration that determines which OpenGL version you get: https://developer.apple.com/graphicsimaging/opengl/capabilities/

Slick2D should support lower OpenGL versions. I have used it with OpenGL 2.1 on Linux and Windows, and my application also ran on OS X.

Try requesting OpenGL 2.1 (on OS X that is the legacy, non-core profile; core profiles only exist from 3.2 onwards). In my testing, 2.1 was the version most compatible across PCs (and Macs).
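For reference, this is roughly how the context is requested in LWJGL 2, which Slick2D builds on (a sketch; the window size is arbitrary). Creating the display without explicit ContextAttribs gives you the legacy context, i.e. 2.1 on OS X, which is what Slick2D's GL10-style calls need:

import org.lwjgl.opengl.ContextAttribs;
import org.lwjgl.opengl.Display;
import org.lwjgl.opengl.DisplayMode;
import org.lwjgl.opengl.PixelFormat;

public class ContextDemo {
    public static void main(String[] args) throws Exception {
        Display.setDisplayMode(new DisplayMode(800, 600));

        // Legacy context: OpenGL 2.1 on OS X. Fixed-function calls
        // (glMatrixMode, glBegin, ...) and Slick2D keep working.
        Display.create();

        // Alternative: a 3.2 core profile, required for modern GL on
        // OS X, but it removes the fixed-function pipeline Slick2D uses.
        // Display.create(new PixelFormat(),
        //         new ContextAttribs(3, 2).withProfileCore(true));

        while (!Display.isCloseRequested()) {
            Display.update();  // swap buffers and process window events
        }
        Display.destroy();
    }
}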

I do not fully understand your second question. Are you looking for an alternative engine? I can recommend libGDX for cross-platform game development in Java (http://libgdx.badlogicgames.com/).

If you are open to non-Java game engines, there is of course Unity3D (http://unity3d.com/).

Question:

I have scoured the internet far and wide looking for answers to this problem. I realize that LWJGL 3 is still a work in progress, but no one else seems to be having this problem. So it goes like this: I have a Mac and a PC, and I like to develop in Java and LWJGL because it is supposed to be cross-platform. However, the app runs just fine on the PC, but when I run it on my Mac with debugging on I get this in the console:

[LWJGL] Version 3.0.0b build 35 | Mac OS X | x86_64
[LWJGL] Loaded library from java.library.path: lwjgl
[LWJGL] MemoryUtil accessor: MemoryAccessorUnsafe
[LWJGL] Loaded native library: lib/libjemalloc.dylib
[LWJGL] MemoryUtil allocator: JEmallocAllocator
[LWJGL] Loaded native library: lib/libglfw.dylib
[LWJGL] Loaded native library bundle: /System/Library/Frameworks/OpenGL.framework
[LWJGL] Failed to locate address for GL function glVertexArrayVertexAttribDivisorEXT

I believe the "Failed to locate..." message appears during the GL.createCapabilities() call. As a result, I think the OpenGL initialization fails and just produces a blank screen, because that seems to be what's happening to me.

Here is the actual code:

        glfwSetErrorCallback(errorCallback = errorCallbackPrint(System.err));

        if (glfwInit() != GL11.GL_TRUE)
            throw new IllegalStateException("Unable to initialize GLFW");

        glfwDefaultWindowHints();
        glfwWindowHint(GLFW_VISIBLE, GL_FALSE);
        glfwWindowHint(GLFW_RESIZABLE, GL_FALSE);

        ByteBuffer vidmode = glfwGetVideoMode(glfwGetPrimaryMonitor());
        WIDTH = GLFWvidmode.width(vidmode);
        HEIGHT = GLFWvidmode.height(vidmode);
        RENDER_RATIO = (float) WIDTH / 1920.0f; // Scales all the rendering by a constant

        window = glfwCreateWindow(WIDTH, HEIGHT, "Game", glfwGetPrimaryMonitor(), NULL);
        if (window == NULL)
            throw new RuntimeException("Failed to create the GLFW window");

        // GLFW Callbacks
        glfwSetKeyCallback(window, keyCallback = new Keyboard());
        glfwSetCursorPosCallback(window, cursorPosCallback = new Mouse.CursorPos());
        glfwSetMouseButtonCallback(window, mouseButtonCallback = new Mouse.MouseButton());
        glfwSetScrollCallback(window, scrollCallback = new Mouse.Scroll());

        glfwMakeContextCurrent(window);
        glfwSwapInterval(1);
        glfwShowWindow(window);

        // Initialize OpenGL
        GL.createCapabilities(); //New for LWJGL 3.0.0b
        GL11.glMatrixMode(GL11.GL_PROJECTION);
        GL11.glLoadIdentity();
        GL11.glOrtho(0, WIDTH, HEIGHT, 0, 1, -1);
        GL11.glMatrixMode(GL11.GL_MODELVIEW);
        GL11.glEnable(GL11.GL_TEXTURE_2D);
        GL11.glClearColor(1f, 0f, 0f, 1.0f); //Red background

The window initializes, but I would guess that the OpenGL context does not, because the window stays black when I set the background to red. Before that I was getting "not started on main thread" exceptions, but those were fixed by adding -XstartOnFirstThread to my VM arguments. This is a multi-threaded game loop, but I made sure to keep the rendering on the main thread and the updating on a new thread (see the sketch after the VM arguments below). It doesn't even get that far, though, because OpenGL is not initializing properly: the program runs, but it locks up as soon as it reaches the rendering calls, freezing the rendering loop. Here are my VM arguments:

-XstartOnFirstThread -Djava.library.path=lib/ -Dorg.lwjgl.util.Debug=true
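
To be clear, the loop is split roughly like this (a simplified sketch; Game and its methods stand in for the real class linked below):

import static org.lwjgl.glfw.GLFW.*;

public class Main {
    public static void main(String[] args) {
        Game game = new Game();          // creates the window + GL context on the main thread

        Thread updater = new Thread(() -> {
            while (game.isRunning())
                game.update();           // game logic only -- no OpenGL calls here
        }, "update-thread");
        updater.start();

        while (game.isRunning()) {       // all GLFW/OpenGL calls stay on the main thread
            game.render();
            glfwSwapBuffers(game.getWindow());
            glfwPollEvents();
        }
    }
}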

I have really had trouble finding any other posts with the same problem. I found some really old posts that said similar issues were bugs, so maybe this is one as well. It could be something simple and I'm just being dumb, but in case it helps I have included the entire class in the link below. Thanks in advance.

Full class: http://pastebin.com/eZ1qXPsd


Answer:

This is a function that was added to the ARB_instanced_arrays extension after it had already been implemented in OpenGL drivers. That means that some drivers may expose the extension even if that particular function is missing, so it's been made optional in LWJGL. This is from the extension spec:

7) How should EXT_direct_state_access interact with this extension?

Resolved: Add glVertexArrayVertexAttribDivisorEXT selector-free vertex array object command and glGetVertexArrayIntegeri_vEXT query must accept VERTEX_ATTRIB_ARRAY_DIVISOR_ARB to return the vertex array object's vertex attrib array divisor state.

The DSA interaction was added July 2013. If implementations respond to a wglGetProcAddress, etc. query for "glVertexArrayVertexAttribDivisorEXT" with a NULL pointer, the DSA functionality is not available.

The message:

Failed to locate address for GL function glVertexArrayVertexAttribDivisorEXT

is a simple warning in debug mode and can safely be ignored. If you need to use that function, you can check if it is available using the following code:

// get the extension instance for the current context
ARBInstancedArrays ext = ARBInstancedArrays.getInstance(); 
if ( ext.VertexArrayVertexAttribDivisorEXT != NULL ) {
    // the function is available
}
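
Note that later LWJGL 3 releases replaced the per-extension getInstance() pattern with a single GLCapabilities object, so on a current build the equivalent check would look like this (a sketch against the newer API):

// On current LWJGL 3 releases the function address is exposed as a field
// on GLCapabilities (org.lwjgl.opengl.GL is assumed to be imported)
if ( GL.getCapabilities().glVertexArrayVertexAttribDivisorEXT != NULL ) {
    // the function is available
}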

Question:

I am trying to develop a voxel rendering engine in Java with LWJGL 2.

I'm launching my application from Eclipse, and I've set the initial heap size to 1024M and the max heap size to 2048M in the "Run Configurations" menu.

When I look at the memory consumption of my program in Java VisualVM, it shows about 500-1000 MB of used heap.

[Diagram of memory consumption in Java VisualVM]

But Activity Monitor on the Mac shows 20 GB of RAM in use. What could be the possible reasons for this enormous discrepancy?


Answer:

The heap size settings apply only to plain Java objects.

If you are using a native library like LWJGL, you can allocate far more native memory, which lives outside the Java heap and is invisible to VisualVM's heap view.

These native allocations often need to be freed explicitly by calling a method on the owning object when you no longer need them.
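
A minimal sketch of how this happens with LWJGL's BufferUtils (the 64 MB size and the chunkData name are just illustrative):

import java.nio.ByteBuffer;

import org.lwjgl.BufferUtils;

public class NativeMemoryDemo {
    public static void main(String[] args) {
        // A direct ByteBuffer: 64 MB of native memory, while the Java heap
        // only holds the small wrapper object that VisualVM counts.
        ByteBuffer chunkData = BufferUtils.createByteBuffer(64 * 1024 * 1024);

        // The native memory is reclaimed only when the wrapper object is
        // garbage-collected; keeping references to many large buffers
        // (e.g. one per voxel chunk) keeps all of that native memory alive.
        chunkData = null;  // drop the reference so it can eventually be freed
    }
}

So if the engine holds on to thousands of such buffers, the process can easily use tens of gigabytes while the heap graph still shows well under the configured maximum.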