Blender Git Log

Commits from Blender's Git "master" development branch.


Revision 2fdc41c by Joshua Leung
November 26, 2008, 23:13 (GMT)
Added access for start/end frames in RNA.
November 26, 2008, 22:52 (GMT)

RNA

* Added support for auto-generating enum properties that are defined as
bitflags mixed with other flags (used for the shadow type in la->mode now).

November 26, 2008, 22:24 (GMT)

RNA

* Made auto collection wrapping work for ** arrays.
(Mesh.mats, still points to ID since there is no Material yet).
* Added sorting of RNA structs/properties before writing.

November 26, 2008, 17:47 (GMT)
VideoTexture: new ImageRender class for Render To Texture

The new class VideoTexture.ImageRender() is available to perform
render to texture in the GE.

Constructor:

VideoTexture.ImageRender(scene,cam)
cam : camera object that will be used for the render.
It must be an inactive camera.
scene: reference to the scene that will be rendered.
The camera must be part of that scene.
Returns an object that can be used as a source of a VideoTexture.Texture object
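
For illustration, a minimal sketch of constructing a render source; the camera
name 'OBRenderCam' is a hypothetical placeholder, and a complete example is
given at the end of the notes below:

import VideoTexture
import GameLogic

scene = GameLogic.getCurrentScene()
# any inactive camera that is part of 'scene' can be used for the render
cam = scene.getObjectList()['OBRenderCam']
# the returned object can be assigned to the source attribute of a Texture object
source = VideoTexture.ImageRender(scene, cam)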

Methods: none

Attributes:

background:
4-tuple representing the background color of the rendering
as RGBA color components, each component being an integer
between 0 and 255.
Default value = [0,0,255,255] (=saturated blue)
Note: although the alpha component can be specified, it is not
supported at the moment; the alpha channel of the rendered
texture will always be 255. You can however introduce an
alpha channel by appending a FilterBlueScreen() filter: it
will set the alpha to 0 (transparent) on all pixels that were
not rendered.

capsize:
2-tuple representing the size of the render area as [x,y] number of pixels.
Default value = largest rectangle with power of 2 dimensions that fits in the canvas
You may want to reduce the render area to increase performance. For example,
a render area of [256,128] is probably sufficient to implement a car inner mirror.
For best performance, use power of 2 dimensions and don't set any filter: this
allows direct transfer between the GPU frame buffer and texture memory
without going through the host.

alpha:
Boolean indicating if the render alpha channel should be copied to the texture.
Default value: False
Experimental, do not use.

whole:
Boolean indicating if the entire canvas should be used for the rendering.
Default value: False
Note: There is no reason to set this attribute to True: the rendering will
in any case be scaled down to the largest rectangle with power of 2
dimensions before transferring to the texture.

Attributes inherited from the ImageBase class:

image : image binary data, read-only
size : [x,y] size of the texture, read-only
scale : set to True for fast scale down in case the render area dimensions are not power of 2
flip : set to True for vertical flip.
filter: set a post-processing filter on the render.
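
As an illustration of the attributes above, a short sketch follows; it assumes a
'source' object created as in the example at the end of these notes, and the
values are arbitrary:

# background color as [R, G, B, A]; this is the default saturated blue
source.background = [0, 0, 255, 255]
# render a small power-of-2 area: cheaper, and keeps the direct GPU transfer path
source.capsize = [256, 128]
# flip the rendered image vertically
source.flip = True
# optionally make the non-rendered (background) pixels transparent;
# note that setting any filter disables the direct GPU transfer
source.filter = VideoTexture.FilterBlueScreen()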

Notes:

* Aspect Ratio
For consistent results in Blender and Blenderplayer, the same aspect ratio used
by Blender to draw the camera viewport (Scene(F10)->Format tab->Size X/Size Y)
is also used during the rendering. You can control the portion of the scene that
will be rendered by "looking through the camera": the zone inside the outer dotted
rectangle will be rendered to the texture.
In order to reproduce the scene without X/Y distortion, you must apply the texture
to an object, or a portion of an object, that has the same aspect ratio.

* Order of rendering
The rendering is performed when you call the refresh() method of the parent
Texture object. This happens outside the normal frame rendering and will have no
effect on it.
However, if you want to use ImageViewport and ImageRender at the same time, be
sure to refresh the viewport texture before the render texture, because the latter
will destroy the frame buffer that is used by the former to update the texture
(a short sketch of this ordering follows the example below).

* Scene status
The meshes are not updated during the render to texture: with regard to mesh
deformation, the rendered texture is one frame behind the normally rendered frame.

* Example:

import VideoTexture
import GameLogic

cont = GameLogic.getCurrentController()
# object that receives the texture
obj = cont.getOwner()
scene = GameLogic.getCurrentScene()
# camera used for the render
tvcam = scene.getObjectList()['OBtvcam']
# assume obj has some faces UV assigned to tv.png
matID = VideoTexture.materialID(obj, 'IMtv.png')
GameLogic.tv = VideoTexture.Texture(obj, matID)
GameLogic.tv.source = VideoTexture.ImageRender(scene, tvcam)
GameLogic.tv.source.capsize = [256, 256]
# to render the texture, just call GameLogic.tv.refresh(True) on each frame.
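
As mentioned under "Order of rendering" above, if an ImageViewport source is used
alongside the ImageRender source, the viewport texture must be refreshed first.
A minimal sketch building on the example above; screenObj and screenMatID are
hypothetical placeholders, and ImageViewport is assumed to be constructed with
no arguments:

# a second texture fed by the viewport, on another object/material
GameLogic.screen = VideoTexture.Texture(screenObj, screenMatID)
GameLogic.screen.source = VideoTexture.ImageViewport()

# on each frame: refresh the viewport texture before the render texture,
# because the render destroys the frame buffer the viewport source reads from
GameLogic.screen.refresh(True)
GameLogic.tv.refresh(True)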

You can download a demo game (with a video file) here:

http://home.scarlet.be/~tsi46445/blender/VideoTextureDemo.zip

For those who have already downloaded the demo, you can just update the blend file:

http://home.scarlet.be/~tsi46445/blender/VideoTextureDemo.blend
November 26, 2008, 17:38 (GMT)
BGE update: helper functions for VideoTexture render to texture feature.
November 26, 2008, 17:35 (GMT)
BGE bug fix: culling algorithm didn't use the correct camera.
November 26, 2008, 17:34 (GMT)
MSVC9 project file update for TEX nodes
November 26, 2008, 17:23 (GMT)
MSVC9 project file update for blenderplayer
Revision 402fbd9 by Robin Allen
November 26, 2008, 13:07 (GMT)
2 things:

* Patch #17998
* tex_input_vec now takes 3-vector as first argument (was 4-vector).

Revision d6ba347 by Matt Ebb
November 26, 2008, 00:47 (GMT)
* Added a new Timeline menu entry: Playback -> Sync Playback to Frames/Sec. It's a
more convenient and sensible place to find this feature than over in the audio settings
(dropping frames in playback is not just for audio!)
Revision 00fae51 by Matt Ebb
November 25, 2008, 23:14 (GMT)
* Added 'Particle System' entries as children of particle system modifiers in the
outliner. Clicking on one activates it and takes you to the object buttons - it's a
lot better for seeing what's going on than a little '< 1 Part 5 >' that gives very
little indication.
November 25, 2008, 19:23 (GMT)
Bringing back icons:
Part 3/3: new icons
- Icon set done by jendrzych! Great job!
- cleaned up unnecessary includes and removed commented out code
- preview icons (for materials, textures, ...) don't work yet; they have to be ported to the new event system
November 25, 2008, 18:32 (GMT)
Bringing back icons:
- Part 2/3: renaming file according to new convention
November 25, 2008, 18:27 (GMT)
Bringing back icons:
- Step 1/3: move relevant files to branch
November 25, 2008, 12:53 (GMT)
* fix path to openexr on irix
* added missing define in storage.c
November 25, 2008, 08:00 (GMT)
* make sure Bullet is enabled when WITH_BF_BULLET=True
November 25, 2008, 03:39 (GMT)
Use const where possible and remove some unneeded args (4.3% speedup)
November 24, 2008, 22:06 (GMT)
minor changes
* fewer calls to BKE_image_get_ibuf when switching between images
* remove thread debug prints
* use own pixel blending funcs (IMB_blend_color isn't needed in some cases); slightly faster without it.
November 24, 2008, 21:57 (GMT)
allow building without python
November 24, 2008, 19:24 (GMT)
Second step out of three for merging sketching subdivision methods with graph subdivision methods.

Graph subdivision uses graph arc iterators.