OnlyJoe

Members · Posts: 7

Everything posted by OnlyJoe

  1. Pretty sure you have to do the animations of entities in code. BlockBench will export the Java class for you, which is helpful. I am not familiar with MCreator or whether you can import from BlockBench into it; I would imagine they have their own animation format under the hood.
  2. In your material file you need to update the path for the .png file. It should be a resource path to a file in the textures directory (which I see you have already copied the file to). I.e. update the line in the .mtl file to:

     map_Kd witchery2:block/witchs_cauldron

     Also, the path in the .json model must be a full path from the root of your mod (this is probably the main problem for you). So make the path:

     "model": "witchery2:models/block/witchs_cauldron.obj"

     adding "models/" to the start.
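For orientation, this is the layout the two fixes assume (directory placement is inferred from the paths in the post, not confirmed):

```
assets/witchery2/
├── models/block/witchs_cauldron.obj
├── models/block/witchs_cauldron.mtl    <- map_Kd witchery2:block/witchs_cauldron
└── textures/block/witchs_cauldron.png  <- referenced without the ".png" suffix
```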
  3. Did you work this out? From what I can tell, the main reason you aren't adding anything to the world is that no code exists in your Piece to actually add blocks. Have a look at SwampHutPiece, or the IglooPieces (the latter uses TemplateStructurePiece).
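For reference, the override that actually places the blocks looks roughly like this. This is a sketch against 1.14/1.15-era MCP names (the method name varies by mappings version), not a drop-in:

```java
// Inside your StructurePiece subclass. Without an override like this,
// the piece is registered and generated but never writes any blocks.
@Override
public boolean addComponentParts(IWorld world, Random rand,
                                 MutableBoundingBox bb, ChunkPos chunkPos) {
    // Place a small 5x5 cobblestone floor, the same way SwampHutPiece
    // places blocks: coordinates are local to the piece and clipped
    // against the bounding box bb.
    for (int x = 0; x < 5; ++x) {
        for (int z = 0; z < 5; ++z) {
            this.setBlockState(world, Blocks.COBBLESTONE.getDefaultState(),
                    x, 0, z, bb);
        }
    }
    return true;
}
```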
  4. That does look quite tidy. Can't think of any reason not to do that. The only thing to be aware of is that most of the registry methods rely heavily on Java reflection, so you just have to avoid having totally isolated classes, or the class loader won't load them. E.g. a class that doesn't have some import chain back to your main class. That is why you have to declare your objects somewhere else, and can't just put a sort of "@RegisterObject" on your class definition.
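For contrast, the usual shape that guarantees the class gets loaded is an event-bus subscriber (a 1.15-era Forge sketch; "mymod" and the block are made-up names):

```java
// The @EventBusSubscriber annotation is scanned by Forge, so this class
// is always reachable and loaded -- unlike a hypothetical "@RegisterObject"
// on an otherwise-unreferenced class, which the class loader may never see.
@Mod.EventBusSubscriber(modid = "mymod", bus = Mod.EventBusSubscriber.Bus.MOD)
public class ModBlocks {
    @SubscribeEvent
    public static void onRegisterBlocks(RegistryEvent.Register<Block> event) {
        event.getRegistry().register(
                new Block(Block.Properties.create(Material.ROCK))
                        .setRegistryName("mymod", "example_block"));
    }
}
```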
  5. You can probably get away with just spawning your particles outside of the: if (!this.world.isRemote) { ... } As the impact is going to get fired for all the players who can see that entity anyway, using the logic that keeps the entity in sync. Spawning on the server will send an SSpawnParticlePacket with the world location, etc., of where the particles should go. If you call this every tick, you send that packet every tick, so it is always better to have the client side do the spawning when you know the clients will already be in sync.
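Concretely, the suggestion looks something like this (a sketch against 1.15-era mappings; the onImpact shape and the particle choice are assumptions):

```java
@Override
protected void onImpact(RayTraceResult result) {
    if (this.world.isRemote) {
        // Client side: every client that can see the entity runs this,
        // so each spawns its own particles locally -- no packet needed.
        this.world.addParticle(ParticleTypes.POOF,
                this.getPosX(), this.getPosY(), this.getPosZ(),
                0.0D, 0.0D, 0.0D);
    } else {
        // Server-side consequences (damage, removal, etc.) stay guarded.
        this.remove();
    }
}
```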
  6. Pretty sure this is where it is going wrong:

     IVertexBuilder vertexBuilder = renderTypeBuffer.getBuffer(RenderType.entityTranslucentCull(WATER));

     This will return an ITEM style vertex buffer. But then when you add your vertex:

     vertexBuilder.pos(matrix4f, x, y, z).color(r / 16f, g / 16f, b / 16f, 1.0f)
         .tex(u, v).overlay(overlayUV).lightmap(lightmapUV).normal(matrix3f, 0.0F, 1.0F, 0.0F).endVertex();

     this is in BLOCK style. The difference is at the end: ITEM style doesn't have the .overlay() element. So you are pushing extra bytes to the buffer, which shifts everything along, meaning you are writing the value of your lightmapUV over where the buffer is expecting the normal values, and this is messing up the lighting calculations. These functions just push directly to a byte buffer in the background, so their order, and which of them are used, is important.

     How to fix: the best option is probably to call vertexBuilder.vertex(...), which will sort out which bits need to go where in the buffer depending on the buffer type. Otherwise, if you are always going to render to an ITEM buffer, just remove the .overlay(overlayUV) section.

     A few other comments. Don't do this in the function that builds your vertex:

     int color = Objects.requireNonNull(tileEntity.getWorld()).getBiome(tileEntity.getPos()).getWaterColor();

     Do it somewhere before, and pass it along. Vertex functions should be very fast, as they tend to get called thousands of times every frame, and this function is doing a lot of lookups in the background to get the same value every time. (I am assuming this is actually a copy of a Minecraft function, but they are often written quite badly.)

     This stuff is likely doing nothing:

     RenderSystem.enableBlend();
     RenderSystem.color4f(1.0F, 1.0F, 1.0F, 1.0f);
     RenderSystem.defaultBlendFunc();
     RenderSystem.disableTexture();

     I can't totally tell the context you are in, but the way that Minecraft renders the GUI and the world are slightly different.
     If you are rendering something in the world, it works like this:

     Stage 1: Building. Minecraft will call all the objects to build one (or groups of) giant vertex buffers (Minecraft uses QUADS as the render type, so it doesn't need separate index and vertex buffers like most 3D code).
     - It is actually groups of vertex buffers, so you have your normal types "solid", "cutout", "translucent"; you can find them all on the RenderType object.
     - IRenderTypeBuffer is often passed to render functions; this is the object that contains all these giant buffers. When you call .getBuffer(RenderType.translucent()), you are just gaining access to the giant buffer of that type.
     - You then add your vertices to the end of the buffer, and it gets passed along to all the other objects that will do the same.
     - Each of these buffers has a different setup for the pre and post calls to OpenGL, as well as a different vertex format. So be careful with the order used when creating your vertex in the builder (it depends on the sort of buffer you are using). You can just call the .vertex() function on the builder and it puts the elements in the correct order for you.

     Stage 2: Rendering.
     - After collecting all the vertices from the objects, the .finish() function is called, which does the actual rendering. (Put a breakpoint in your render function and look at the call stack.)
     - Each buffer type has an "enter" and an "exit" function declared somewhere (can't remember exactly where).
     - The process is simple: call the "enter" function, which sets up the GL options (blend, alpha, etc.); push the vertex buffer to the graphics card; call the "exit" function, which resets all the GL options used.
     - This works because Minecraft only uses one texture. But it does have some drawbacks: because they just use one buffer, you need to transform vertices on the CPU before adding them to the buffer, which is why MatrixStack is passed around everywhere (but as these are such low-poly models, this probably isn't much of an issue).
     Also, they use QUADS; with quads the OpenGL driver has to convert them to triangles before pushing them to the card, and this will be happening every frame. However, again, with such low-poly models it probably doesn't really matter.

     So hopefully you can see that when you call "RenderSystem" methods you are just setting the OpenGL options directly, and these will then get overridden during the later render phase, or might not get turned off for the rendering of other buffers.

     Worth noting that the GUI render is slightly different: there you can actually do direct drawing, using a Tessellator object and its .finish() call, and in that situation you do need to use "RenderSystem" to set up your options. This is because GUIs need to use a lot more textures; every dialog is typically its own texture.
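The misalignment described above is easy to demonstrate outside of Minecraft. The sketch below uses illustrative byte layouts (not Minecraft's exact formats): it writes one vertex with an extra 4-byte element, standing in for the unexpected .overlay() call, and shows that every later field lands 4 bytes past where a reader of the buffer expects it:

```java
import java.nio.ByteBuffer;

public class VertexOffsetDemo {
    // Offset a reader expects for the lightmap in a format WITHOUT overlay:
    // pos 3 floats (12 bytes) + color (4) + tex 2 floats (8) = 24.
    static final int EXPECTED_LIGHTMAP_OFFSET = 24;

    // Writes one vertex the way the broken code does (with the extra
    // overlay element) and reports where the lightmap actually lands.
    static int actualLightmapOffset() {
        ByteBuffer buf = ByteBuffer.allocate(64);
        buf.putFloat(1f).putFloat(2f).putFloat(3f); // pos     (12 bytes)
        buf.putInt(0xFFFFFFFF);                     // color   (4 bytes)
        buf.putFloat(0f).putFloat(1f);              // tex     (8 bytes)
        buf.putInt(0x00A000A0);                     // overlay (4 bytes) -- not in this format!
        return buf.position();                      // lightmap goes here
    }

    public static void main(String[] args) {
        // The reader interprets the overlay bytes as the lightmap, and the
        // real lightmap bytes spill into the next element (the normal).
        System.out.println("expected " + EXPECTED_LIGHTMAP_OFFSET
                + ", actual " + actualLightmapOffset());
        // prints: expected 24, actual 28
    }
}
```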
  7. Using .obj files seems to have changed a lot. I am assuming that 1.15 has removed the ForgeLoader for blockstates, as the multipart system really covers everything it did (except for loading .obj). The new way to load models:

     Your blockstate should use the vanilla method (without ".obj" on the model name), e.g.:

     { "variants": { "": { "model": "MODID:block/MODEL" } } }

     Every model in the "models/block" path now needs MODEL.json, MODEL.obj, and MODEL.mtl. In your new MODEL.json file you need to tell it to use the OBJLoader from Forge, and give the full resource path to your ".obj" file, e.g.:

     {
       "loader": "forge:obj",
       "model": "MODID:models/block/MODEL.obj",
       "flip-v": true
     }

     The options you can include in the new MODEL.json file are:

     {
       "loader": "forge:obj", // to inform the resource loader to use OBJLoader
       "model": <<resource path to .obj file>>,
       "detectCullableFaces": true|false,
       "diffuseLighting": true|false,
       "flip-v": true|false,
       "ambientToFullbright": true|false,
       "materialLibraryOverride": <<resource path to a material library, if you need different materials on the same .obj>>
     }

     Remember to update the paths for your textures in the .mtl file to use resource paths (without ".png").
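Assembled into a concrete example for a hypothetical mod "mymod" with a model called "barrel" (all names made up for illustration), the files sit like this:

```
assets/mymod/blockstates/barrel.json:
{ "variants": { "": { "model": "mymod:block/barrel" } } }

assets/mymod/models/block/barrel.json:
{
  "loader": "forge:obj",
  "model": "mymod:models/block/barrel.obj",
  "flip-v": true
}

assets/mymod/models/block/barrel.mtl (texture line: resource path, no ".png"):
map_Kd mymod:block/barrel
```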