Blender RGB > UV Texture

Going by this tip, it looks like this is what is needed to get a good texture.

Can anyone give a better example of how to “project the RGB color per vertex to the OBJ with UVs” in Blender?

**But I have a tip for you**
Export the OBJ from Revo Scan after meshing it, while the RGB per-vertex color is still there.

Then create the textures and export a second OBJ with the textures.

Import both into Blender (or ZBrush, as I do), project the RGB color per vertex onto the OBJ with UVs, and you get new textures from the RGB in seconds.

This is the fastest way to copy the RGB to UVs.
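The transfer described above can be sketched outside any 3D package as a nearest-vertex color copy: for each vertex of the UV'd mesh, take the color of the closest vertex in the RGB-only scan. Everything below (the data and the function names) is made-up toy illustration, not a Revo Scan, Blender, or ZBrush API:

```python
# Minimal sketch of "project RGB per-vertex color onto a UV'd mesh".
# A real scan has thousands of vertices; this toy example has two.

def nearest_vertex_color(p, rgb_verts, rgb_colors):
    """Copy the color of the closest vertex in the RGB-only scan."""
    best_i = min(range(len(rgb_verts)),
                 key=lambda i: sum((a - b) ** 2 for a, b in zip(p, rgb_verts[i])))
    return rgb_colors[best_i]

def transfer(uv_mesh_verts, rgb_verts, rgb_colors):
    """Per-vertex color transfer from the RGB mesh to the UV'd mesh."""
    return [nearest_vertex_color(p, rgb_verts, rgb_colors) for p in uv_mesh_verts]

# Toy data: an RGB scan with two colored vertices ...
rgb_verts  = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
rgb_colors = [(255, 0, 0), (0, 0, 255)]
# ... and a remeshed/UV'd model whose vertices sit near them.
uv_verts   = [(0.05, 0.0, 0.0), (0.95, 0.0, 0.0)]

print(transfer(uv_verts, rgb_verts, rgb_colors))
# [(255, 0, 0), (0, 0, 255)]
```

This works because the remeshed model sits on (almost) the same surface as the scan, so the nearest scan vertex is a good color source.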

Revo Studio and textures - #45 by PopUpTheVolume

Edit: Looks like this is in the wrong section, sorry; feel free to move it to Help.

Hi @revohorse

Textures have improved so much already with POP2. What scanner are you using?
Make sure you have the latest software update.

The RGB data is sometimes much better, but you need to fuse it at the highest settings, and the same for the mesh.
The more points, the better the resolution will be.

There is not really any simpler way to transfer the RGB color per vertex.

In ZBrush I create a new model based on the scan with my own UVs and transfer the textures and mesh details to it. In the process I get not only the albedo, but also displacement and normal maps. So my model is low resolution but has all the rest of the details in maps.

You can see in my simple video here how I do it in ZBrush.


I’m using POP2.

When the mesh is done without the texture step it’s much better; when you do the texture step, it turns bad.

So I was under the impression that by keeping the unaltered rgb.obj I would somehow be keeping the quality of the 2nd step.

I saw your video today, about 5 minutes ago 🙂 . It’s good, but it’s ZBrush. Someone must have done it in Blender here, surely…
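For what it's worth, in Blender the usual route (as far as I know) is a Data Transfer modifier to copy the vertex colors from the RGB scan onto the UV'd mesh, then baking that color attribute to an image. The bake step itself boils down to barycentric interpolation of the three vertex colors across each UV triangle; here is a minimal pure-Python sketch of that math (the triangle, colors, and function names are made-up toy data, not the Blender API):

```python
# Sketch of the "bake" step: fill a UV texture pixel by barycentric
# interpolation of per-vertex colors over a UV triangle.

def barycentric(p, a, b, c):
    """2D barycentric coordinates of point p in triangle (a, b, c)."""
    den = (b[1] - c[1]) * (a[0] - c[0]) + (c[0] - b[0]) * (a[1] - c[1])
    u = ((b[1] - c[1]) * (p[0] - c[0]) + (c[0] - b[0]) * (p[1] - c[1])) / den
    v = ((c[1] - a[1]) * (p[0] - c[0]) + (a[0] - c[0]) * (p[1] - c[1])) / den
    return u, v, 1.0 - u - v

def bake_pixel(p, uv_tri, colors):
    """Color of texture pixel p, or None if p is outside the triangle."""
    u, v, w = barycentric(p, *uv_tri)
    if min(u, v, w) < 0:          # pixel lies outside this UV triangle
        return None
    return tuple(round(u * ca + v * cb + w * cc)
                 for ca, cb, cc in zip(*colors))

uv_tri = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]       # UV coords of one face
colors = [(255, 0, 0), (0, 255, 0), (0, 0, 255)]    # its vertex colors
print(bake_pixel((0.25, 0.25), uv_tri, colors))
# (128, 64, 64)
```

A real baker loops this over every pixel of the texture and every UV face (plus a margin around the islands); this is just the per-pixel core.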



  1. Point Cloud

  2. Fused

  3. Meshed

  4. Texture (blurred? looks like cel-shaded in a game?)

You need to set up proper lighting to capture the textures and eliminate all shadows. The preview shading is not important; the collected color per vertex is.
Without proper softbox lighting and brightness/exposure, you get what you got, so a little extra work on your side is required.

Ok. This is the second-best lighting I could get.

The best was in a grow tent, which is reflective, with a fluorescent light, so the whole figure was lit. The trouble there is that I have to use a laptop, and even though it’s a fairly decent laptop, it’s not as good as processing on a desktop.

You need light coming from all directions at the same intensity, and diffused, if you want to capture textures.
It is different from capturing just one picture; with multiple pictures per second, all areas need to be lit up equally, something you still need to work on a little here.
Laptops usually don’t have the power for very fast 3D scan processing, but it is not that bad when you scan small objects.

Looking at what you got, there are still lots of improvements to be made, but I am sure you will make it work with a little practice and testing.

I would recommend the light come from the right side, behind the scanner, as that is where the RGB camera takes the pictures.

A small LED with a diffuser, like they sell on Amazon or in the Revopoint store, could be more useful. You already have the back lighting, so now you only need the front lighting. You can also try using a phone light with a simple sheet of white paper over the LED; that would do the trick.
Then you need to set the RGB brightness to Auto, and once it settles, switch to Manual and start scanning.

Please play with it a bit; I am sure you will get it right. Just remember that the top or back light can’t be brighter than the front light.

You need light coming from all directions at the same intensity, and diffused, if you want to capture textures.

Yes, a grow tent works well; it’s the best I have had so far, anyway.

Thanks for the heads up on diffusing the light; I will try out some of the paper hacks.

These models will just be printed anyway, so the texture isn’t a massive issue; it’s just a bonus, as I could use them in a game display (not as controllable characters), and I thought there might be a process better than the software’s.
