As per the Radeon 9700 OpenGL Programming and Optimization Guide [1], there are
16 texture units even on the first r300 chipsets. If you think I am wrong,
feel free to propose a patch.

[1] Here's the PDF: http://people.freedesktop.org/~mareko/

Fixes Mac OS X SCons build.

Fixes Mac OS X SCons build.

Fixes Mac OS X SCons build.

The demo uses a Pixmap as its drawing area, and whatever is drawn on the
pixmap will be used as a texture to draw a cube.
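
The demo's own code isn't reproduced here; as an illustration only, one way to
get pixmap contents into a texture is EGL_KHR_image_pixmap plus
GL_OES_EGL_image, and all identifiers below are made up for the sketch:

    #include <stdint.h>
    #include <EGL/egl.h>
    #include <EGL/eglext.h>
    #include <GLES2/gl2.h>
    #include <GLES2/gl2ext.h>
    #include <X11/Xlib.h>

    /* Wrap an X pixmap in an EGLImage and use it as texture storage. */
    static GLuint texture_from_pixmap(EGLDisplay dpy, Pixmap pixmap)
    {
        PFNEGLCREATEIMAGEKHRPROC create_image =
            (PFNEGLCREATEIMAGEKHRPROC)
                eglGetProcAddress("eglCreateImageKHR");
        PFNGLEGLIMAGETARGETTEXTURE2DOESPROC target_texture =
            (PFNGLEGLIMAGETARGETTEXTURE2DOESPROC)
                eglGetProcAddress("glEGLImageTargetTexture2DOES");

        EGLImageKHR img = create_image(dpy, EGL_NO_CONTEXT,
                                       EGL_NATIVE_PIXMAP_KHR,
                                       (EGLClientBuffer)(uintptr_t)pixmap,
                                       NULL);

        GLuint tex;
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        /* The pixmap's current contents now back this texture. */
        target_texture(GL_TEXTURE_2D, (GLeglImageOES)img);
        return tex;
    }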

Use st_manager::get_egl_image to look up GLeglImageOES and implement
EGLImageTargetTexture2D and EGLImageTargetRenderbufferStorage.
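
The next entry introduces the hook itself; here is a rough sketch of its
shape and the lookup path, with all struct layouts assumed rather than copied
from st_api.h:

    #include <stdbool.h>

    struct pipe_resource;  /* gallium resource, opaque here */

    /* Assumed shape of the resolved image. */
    struct st_egl_image_sketch {
        struct pipe_resource *texture;
        unsigned level;
        unsigned layer;
    };

    /* Assumed shape of the manager hook. */
    struct st_manager_sketch {
        bool (*get_egl_image)(struct st_manager_sketch *smapi,
                              void *egl_image,
                              struct st_egl_image_sketch *out);
    };

    /* EGLImageTargetTexture2D boils down to: resolve the opaque
     * handle, then use the returned resource as texture storage. */
    static void
    egl_image_target_texture_2d(struct st_manager_sketch *smapi,
                                void *gl_egl_image)
    {
        struct st_egl_image_sketch img;

        if (!smapi->get_egl_image(smapi, gl_egl_image, &img))
            return;  /* invalid handle; a real driver sets a GL error */

        /* ... bind img.texture to the current texture object ... */
        (void)img;
    }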

This hook may be used by rendering state trackers to implement EGLImage
extensions.

We should use the pitch for the overridden state; fixes one half of the tfp
test.
Signed-off-by: Dave Airlie <airlied@redhat.com>

In case anyone needs it, it's here.

It makes life easier for some code browsing utilities.

The setup needs to be done after querying the tiling flags.

This helps debugging on darwin.
Signed-off-by: Jeremy Huddleston <jeremyhu@apple.com>

The tiling setup needs a bit of work, but this should be good enough for now.
When we get buffers from the kernel, we need to store their tiling properties.
Signed-off-by: Dave Airlie <airlied@redhat.com>
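
A hedged sketch of what storing those properties can look like with
libdrm_radeon's radeon_bo_get_tiling(); the surrounding struct is made up for
illustration:

    #include <stdint.h>
    #include <radeon_bo.h>  /* libdrm_radeon */

    /* Illustrative winsys-side wrapper around a kernel buffer. */
    struct kernel_buffer_sketch {
        struct radeon_bo *bo;
        uint32_t tiling_flags;  /* micro/macro tiling bits */
        uint32_t pitch;
    };

    /* Remember the kernel's tiling setup so later state emission
     * matches the buffer we were handed. */
    static void remember_tiling(struct kernel_buffer_sketch *buf)
    {
        radeon_bo_get_tiling(buf->bo, &buf->tiling_flags, &buf->pitch);
    }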

Print the output target in the FP debug.
Signed-off-by: Dave Airlie <airlied@redhat.com>

support.
Not all of it is bad, but I'm afraid I'll have to throw the baby out with the
bathwater, given they are all tied together.

There's no good way of aligning the outputs, and since the vertex_header is
variable-sized in the first place, we need to extract elements from a vector
and store them individually into an array. This gets the basic examples
working again.
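
For illustration, the per-element store looks roughly like this with the
(older) LLVM C API; the helper name is made up:

    #include <llvm-c/Core.h>

    /* Store each lane of a <4 x float> value into consecutive floats,
     * since the variable-sized vertex_header rules out one aligned
     * vector store. */
    static void store_elements(LLVMBuilderRef b, LLVMValueRef vec4,
                               LLVMValueRef float_ptr)
    {
        unsigned i;
        for (i = 0; i < 4; i++) {
            LLVMValueRef idx = LLVMConstInt(LLVMInt32Type(), i, 0);
            LLVMValueRef elem =
                LLVMBuildExtractElement(b, vec4, idx, "elem");
            LLVMValueRef slot =
                LLVMBuildGEP(b, float_ptr, &idx, 1, "slot");
            LLVMBuildStore(b, elem, slot);
        }
    }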

From fglrx traces, the dithering is never enabled.
Signed-off-by: Dave Airlie <airlied@redhat.com>

With an Intel 855GM handled by intel_drv, there's a crash when the
Gallium3D-enabled DRI driver for Intel i915 (--enable-gallium-intel) is used.
The Gallium3D driver doesn't support the 855GM as intel_drv expects: it fails
to open the screen and hands a half-initialized screen structure to
dri_destroy_option_cache(). optionCache.info is NULL, so freeing the array
contents crashes. This patch at least fixes the crash in that function.

Here are some logs from the fixed version:
[ 16274.137] LoaderOpen(/opt/mesa/lib/xorg/modules/drivers/intel_drv.so)
[ 16274.139] (II) Loading /opt/mesa/lib/xorg/modules/drivers/intel_drv.so
[ 16274.183] (II) Module intel: vendor="X.Org Foundation"
[ 16274.183] compiled for 1.8.0, module version = 2.11.0
[ 16274.183] Module class: X.Org Video Driver
[ 16274.183] ABI class: X.Org Video Driver, version 7.0
[ 16274.183] (II) intel: Driver for Intel Integrated Graphics Chipsets: i810,
i810-dc100, i810e, i815, i830M, 845G, 852GM/855GM, 865G, 915G,
E7221 (i915), 915GM, 945G, 945GM, 945GME, Pineview GM, Pineview G,
965G, G35, 965Q, 946GZ, 965GM, 965GME/GLE, G33, Q35, Q33, GM45,
4 Series, G45/G43, Q45/Q43, G41, B43, Clarkdale, Arrandale
[ 16274.382] (II) intel(0): Integrated Graphics Chipset: Intel(R) 855GME
[ 16274.382] (--) intel(0): Chipset: "852GM/855GM"
[ 16276.675] (II) intel(0): [DRI2] Setup complete
[ 16276.675] (II) intel(0): [DRI2] DRI driver: i915
debug_get_option: GALLIUM_TRACE = (null)
debug_get_bool_option: GALLIUM_RBUG = FALSE
debug_get_bool_option: INTEL_DUMP_CMD = FALSE
i915_create_screen: unknown pci id 0x3582, cannot create screen
dri_init_screen_helper: failed to create pipe_screen
[ 16276.794] (EE) AIGLX error: Calling driver entry point failed
[ 16276.794] (EE) AIGLX: reverting to software rendering
[ 16276.794] (II) AIGLX: Screen 0 is not DRI capable
[ 16276.796] (II) AIGLX: Loaded and initialized /opt/mesa/lib/dri/swrast_dri.so
[ 16276.796] (II) GLX: Initialized DRISWRAST GL provider for screen 0
Signed-off-by: Yann Droneaud <yann@droneaud.fr>
Reviewed-by: Corbin Simpson <MostAwesomeDude@gmail.com>
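
The guard itself is simple; a minimal sketch with assumed structure names
(the real code lives around dri_destroy_option_cache()):

    #include <stdlib.h>

    struct option_info_sketch { char *name; };

    struct option_cache_sketch {
        struct option_info_sketch *info;  /* NULL if init never ran */
        void *values;
        unsigned table_size;              /* log2 of the table length */
    };

    static void destroy_option_cache(struct option_cache_sketch *cache)
    {
        unsigned i;

        if (cache->info) {  /* the added check: skip a half-built cache */
            for (i = 0; i < (1u << cache->table_size); i++)
                free(cache->info[i].name);
            free(cache->info);
        }
        free(cache->values);
    }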

MAXWIDTH/HEIGHT were 2048, but the max texture size was 4096. This caused a
crash if a 4Kx4K texture was created and rendered to. See the comment about
the max framebuffer size in lp_scene.h. Also added assertions to catch this
inconsistency in the future.
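
A sketch of the consistency check; the macro names below are stand-ins for
the driver's own:

    #include <assert.h>

    #define MAX_TEXTURE_SIZE 4096
    #define MAX_FB_WIDTH     4096  /* raised from 2048 to match */
    #define MAX_FB_HEIGHT    4096

    int main(void)
    {
        /* Catch the 2048-vs-4096 mismatch described above. */
        assert(MAX_FB_WIDTH >= MAX_TEXTURE_SIZE);
        assert(MAX_FB_HEIGHT >= MAX_TEXTURE_SIZE);
        return 0;
    }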

The code is correct. Tell Coverity that the fallthrough case is
intentional.
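
For illustration, the usual way to mark this is a comment on the case that
falls through; the function below is made up:

    /* Accumulate sizes across cases; falling through is the point. */
    static int header_bytes(int kind)
    {
        int bytes = 0;
        switch (kind) {
        case 2:
            bytes += 4;
            /* fall-through: kind 2 includes everything kind 1 has */
        case 1:
            bytes += 8;
            break;
        default:
            break;
        }
        return bytes;
    }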

Add ifdef guards around variables of types defined only for
GLX_DIRECT_RENDERING.
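
The pattern looks like this; a stand-in type is used so the snippet is
self-contained, whereas the real types come from the GLX internals:

    #if defined(GLX_DIRECT_RENDERING)
    /* This type only exists in direct-rendering builds. */
    struct dri_drawable_sketch { int fd; };
    #endif

    void flush_current_drawable(void)
    {
    #if defined(GLX_DIRECT_RENDERING)
        /* The variable needs the same guard as its type and users. */
        struct dri_drawable_sketch *pdraw = 0;
        (void)pdraw;
    #endif
    }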