CUDA invalid value error when using 2.15GB texture #70242
Reference: blender/blender#70242
System Information
Operating system: Windows 10 Pro
Graphics card: NVIDIA RTX 2080
64 GB memory
Intel i9-9900K CPU
Blender Version
Broken: 2.80
Short description of error
Earth.blend, stars.hdr. Rendering with a 16,000 x 9,000 HDRI environment texture of a star field, in Cycles with GPU compute using (oh no) CUDA. Crashes, see attached image. My system has 64 GB of memory; there is no memory issue, but CUDA creates one because it is trash.
Exact steps for others to reproduce the error
Render the scene as-is.
Added subscriber: @CMalcheski
Added subscriber: @brecht
This texture takes up 2.15GB of memory (since it needs to be uncompressed for rendering).
Windows WDDM seems to impose a limit of 2GB on individual allocations for some graphics cards, and I'm guessing that is what is happening here.
The simple workaround is to use a smaller texture, there's no need for environment textures to be that high resolution in most cases.
Can you verify if a smaller resolution texture works, so we can confirm it is related to that?
We may be able to automatically detect such cases and allocate the texture in slower CPU memory instead. But I couldn't immediately find an API to check for that limit, if one exists.
Changed title from 'Again With Cuda' to 'CUDA invalid value error when using 2.15GB texture'
Added subscriber: @mont29
Changed status from 'Open' to: 'Archived'
More than a week without reply or activity. Due to the policy of the tracker, archiving until the required info/data are provided.
A smaller texture worked. This was an obvious pointer overflow within CUDA: the texture size crossed the limit of positive values for a 31-bit integer (bits 0-30 being the value; bit 31 being the sign bit). With bit 31 set, pointers were being treated as negative numbers. Why are they using 32-bit anything with today's cards? Oh well. Now it's known that the limit exists. This is why I hate high level languages.
[EDIT: "float" changed to "integer."]