
Cycles: Split kernel error with OpenCL
Closed, Resolved · Public · Bug

Description

System Information
Operating system: Windows 10
Graphics card:

  • Radeon RX 570 Series
  • RX 480 8G, driver version 20.2.2
  • Radeon RX550/550 Series ATI Technologies Inc. 4.5.13596 Core Profile Context 20.Q4 27.20.11027.5002

Blender Version
Broken: 2.91.0, 2.92.0
Worked: ?

Short description of error
GPU rendering fails with a split kernel error, often in combination with an OpenGL shader compilation issue due to seemingly incorrectly reported extensions by the device.

Exact steps for others to reproduce the error

  1. Configure the user preferences to use OpenCL
  2. Set the Device to GPU in the Render Properties
  3. Either set the Viewport Shading to Rendered or render the scene (F12)
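
For reference, the same reproduction steps can be driven from Blender's Python console. This is only a rough sketch assuming the 2.92 bpy API; the preference and device property names below are taken from that API and may differ in other versions.

```python
# Rough scripted equivalent of the reproduction steps above (assumes the
# Blender 2.92 bpy API; run from the Scripting workspace or Python console).
import bpy

# Step 1: set the Cycles compute backend in the user preferences to OpenCL.
prefs = bpy.context.preferences.addons['cycles'].preferences
prefs.compute_device_type = 'OPENCL'
prefs.get_devices()  # refresh the detected device list
for device in prefs.devices:
    device.use = (device.type == 'OPENCL')  # enable the OpenCL GPU(s)

# Step 2: set the render device to GPU in the Render Properties.
scene = bpy.context.scene
scene.render.engine = 'CYCLES'
scene.cycles.device = 'GPU'

# Step 3: render the scene (equivalent to pressing F12). The split kernel
# error, if it occurs, shows up in the console while the kernels compile.
bpy.ops.render.render(write_still=False)
```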


Event Timeline


Having the same problem. Reverting to Blender 2.90 fixed it, but I wonder if it will work on future releases.

Bodhi (Bods) added a comment. Edited Feb 25 2021, 7:58 PM

System Information
Operating system: Windows 10 Home
Graphics card: AMD Radeon Vega 3 graphics

Blender Version
Broken: 2.92.0

Short description of error
GPU rendering fails (most often with Rendered Viewport Shading in the viewport) with a split kernel error:

Split kernel error: failed to load kernel_indirect_background
Compiling render kernels
Split kernel error: failed to load kernel_indirect_background

The cube doesn't render (100% transparent), and changing modes causes a freeze.

Exact steps for others to reproduce the error

Configure the user preferences to use OpenCL
Set the Device to GPU in the Render Properties
Either set the Viewport Shading to Rendered, or render the scene (although rendering does work sometimes)

ersin (Kad) added a subscriber: ersin (Kad). Edited Mar 14 2021, 5:19 PM

Hey there,

I've had the same problems with Cycles viewport rendering on my RX 580 since Blender 2.8.
This workaround worked for me in Blender 2.92:
Render your image (F12) to load all the kernels. After the kernels are loaded you can cancel the render.
Now the render preview in the viewport should work.
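
For what it's worth, this "warm up the kernels first" workaround can also be scripted instead of pressing F12 and cancelling by hand. This is just a sketch under the assumption that Cycles is already configured for OpenCL GPU rendering; the property names follow the 2.9x bpy API.

```python
# Sketch of the kernel warm-up workaround described above: do one cheap
# blocking render so the split kernels get compiled/loaded, then restore
# the sample count before switching to Rendered viewport shading.
import bpy

scene = bpy.context.scene
original_samples = scene.cycles.samples

scene.cycles.samples = 1                    # one sample is enough to load kernels
bpy.ops.render.render(write_still=False)    # equivalent to F12, blocks until done

scene.cycles.samples = original_samples     # restore the original settings
# The Rendered viewport should now reuse the already-loaded kernels.
```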

I can confirm that with the latest 21.3.1 Beta driver, as of the time of posting, it is still broken. I understand it is a difficult problem, but it is disappointing that AMD seems to have no regard for making this work in software they invest money in and even donate their own developers to, yet they haven't been able to make it work in their own driver for six months now. What a disappointment from AMD; if I get a chance to sell my RX 6800 and buy NVIDIA, I will do it without so much as a blink. Someone please remind me to never buy an AMD GPU again!

Not even sure this is entirely on AMD given that 2.90 and down still work perfectly.

Of course it's AMD. My Eevee shaders take 10 minutes to compile, but with the open-source drivers on Linux they take 30 seconds. If they bothered to take a clue from the open-source drivers they could make it 20 times better, but they can't be bothered with that; the important thing is that Cyberpunk works properly with Radeon Boost and Anti-Lag works with DX12.

Make sure you have installed the latest AMD Pro driver, as suggested in the Blender manual.

If you can tell me where I can get this Pro driver, then I will go download it, but to my knowledge there is no such thing from AMD.

Here you go: https://www.amd.com/en/support/kb/release-notes/rn-pro-win-20-q4

Those are for Radeon Pro series graphics cards. The page clearly lists the supported GPUs, and the RX 6000 series is not on that list. Is there a good reason why you are suggesting a driver that is not supported on my GPU? What am I missing?

It's not on the list, but it should work; at the very least I know it works for the 5000 series and below.

Unfortunately the RX 6800 is not supported. For my RX 580 it solved the issue with OpenCL GPU rendering and viewport rendering.

Interesting. Why don't they list it then? I imagine a lot of people with RX 400 and 500 series cards would solve the problem if they knew it worked.

AMD RX 590 video card: system crashes in Cycles on any driver version. I have given up on the idea of buying AMD video cards forever. This problem has not been solved for a very long time.

Use LuxCoreRender: it's way better than Cycles, faster, and rock solid with AMD cards. It's free, and you can use it inside Blender or as a standalone. \o/
The BF has invested too much in Cycles; I don't know why they don't just absorb LuxCoreRender, since it works with all brands, and I don't think that's going to change before 3.0. Anyway, it's a funny fact that an open-source program relies heavily and almost exclusively on proprietary tech. The other thing I don't understand is why they don't hire the creator of LuxCoreRender, as that guy really knows how to render with OpenCL, GL, etc.

https://luxcorerender.org/

The problem with all add-on renderers for Blender is that they use double the amount of VRAM compared to Cycles, because the data has to be duplicated: one instance in Blender and one instance of the exact same data in the renderer. If you have 8 GB of VRAM, you effectively have 4 GB, and that in itself is a horrible limitation that all external renderers suffer from. Cycles doesn't have this problem since it is tightly integrated with Blender. Furthermore, LuxCore is certainly nice for architectural interior renderings, but there is no way it can keep up with the pace of development that Cycles has, since LuxCore has virtually no funding. While it can produce some nice renders and has some advantages over Cycles, it is unbalanced: better in some areas, worse in others. Overall it is a more bare-bones renderer with some nice bells and whistles that will come to Cycles sooner or later, like the light cache. And LuxCore is not exempt from kernel errors; it's just that in LuxCore you don't know when they will strike. Here at least I know it doesn't work, and that's the end of it.

Give your RX 6800 to your niece to play Cyberpunk without ray tracing; that's the only thing it's good for. What does she even know about what ray tracing is? At least she will have Radeon Boost and Anti-Lag, and I'm sure she will figure out what Radeon Boost is, since the AMD control panel is getting more and more intrusive with each version. First they start with annoying overlays; next thing they are displaying ads in your brain to justify the price of already overpriced hardware. Mark my words, your niece is not safe when the AMD GPU division strikes back.

Bottom line: get NVIDIA for professional work; at least the renderer will fire up, if nothing else.

@Rafael (DrkCrb) @I did must have done it (leonardos) Friendly reminder that this is the bug tracker and that all discussions should stay on the topic of this task. Any other discussions can be continued on our community websites.

Yes, thanks for reminding me! I had totally forgotten that this is a bug tracker! Silly me! EDIT: but I think that such a severe bug not being fixed for so long contributed significantly to my dementia, so I think I can be forgiven, yes?

Possible workaround in rB91c44fe88547: Cycles: disable NanoVDB for AMD OpenCL.

I could not reproduce the issue myself, so any confirmation that this avoids the problem would help. (Test with a new build once the buildbot is updated; the Windows builder seems to be down right now but should be up again soon.)
https://builder.blender.org/download/

Looks like AMD is back on Cycles after a long hiatus. I can confirm that Cycles is working on my AMD GPU since your patch. I don't even know what NanoVDB is; is it something to speed up volumetrics? Well, at least it works now; not everyone needs faster volumetrics.

Anyway, it works fine now. Thanks!

Actually, after some more testing, it appears it still has severe problems compiling the OpenCL kernels. At least in my case, it appears to work correctly if I first plug an EXR map into the Background node and then connect that into the World output. BUT if I just open Blender and switch to GPU rendering with only the Background node plugged in, it fails again:

OpenCL error (-52): CL_INVALID_KERNEL_ARGS in clEnqueueNDRangeKernel
OpenCL error: CL_INVALID_KERNEL_ARGS in clEnqueueNDRangeKernel()
OpenCL error: CL_INVALID_KERNEL_ARGS in clEnqueueNDRangeKernel(cqCommandQueue, kernel, 2, NULL, global_size, NULL, 0, NULL, NULL) (D:\_blender\_git\2.93\blender\intern\cycles\device\opencl\device_opencl_impl.cpp:1263)

So again: it works if a map is plugged into the Background node, and it also appears to work if you then disconnect the map from the Background node, but it errors if you try to use just the Background node without first plugging anything into it.

It appears that the error happens when (1) the scene is empty and (2) only the Background node is plugged into the World Output. So I guess it's not a big problem, since nobody needs an empty scene. It seems to work if there is something in the scene or if a map is plugged into the Background node.
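
To make that setup easier to reproduce, here is a rough script that builds the "empty scene, bare Background node" case described above. It assumes Cycles is already set to OpenCL GPU rendering; the node identifiers follow the standard bpy shader node names.

```python
# Sketch of the reported failure case: an empty scene whose World has only a
# Background node connected to the World Output (no image texture).
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'
scene.cycles.device = 'GPU'

# Empty the scene.
bpy.ops.object.select_all(action='SELECT')
bpy.ops.object.delete()

# Rebuild the World node tree with just Background -> World Output.
world = scene.world
if world is None:
    world = bpy.data.worlds.new("World")
    scene.world = world
world.use_nodes = True
nodes = world.node_tree.nodes
links = world.node_tree.links
nodes.clear()
background = nodes.new('ShaderNodeBackground')
output = nodes.new('ShaderNodeOutputWorld')
links.new(background.outputs['Background'], output.inputs['Surface'])

# Switching the viewport to Rendered shading (or rendering with F12) at this
# point is where the CL_INVALID_KERNEL_ARGS error was reported to appear.
```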

To follow up on the earlier question about why the consumer cards aren't listed for the Pro driver: while that support frustratingly isn't listed on the driver webpage, it is listed on page 12 of the full release notes.

The latest Pro Driver was released March 24. Link

Under “Fixed Issues” it lists:

Image corruption and stability issues in Blender application.

I gave it a try. It makes no difference for my GPU, and arguably it runs worse.

OK, so I don't know if this will work for you, but it did work for me, and I hope it works for you too.

I have the same problem in Blender:

split kernel error: failed to load kernel_indirect_background

etc.

The solution I found is to download Blender 2.8, because 2.8 works with AMD GPUs. I know most of you don't want to use Blender 2.8, so instead: open the Blender 2.8 folder in File Explorer and search for "amd", copy all of the files in the results, then go to the folder of the Blender version you want to use and paste them there. You should then be able to use Cycles rendering with an AMD GPU. I hope this works for you.

Lingfox (Lingfox) added a comment. Edited Mar 31 2021, 1:14 PM

Not exactly the solution I'm looking for. For me, Cycles was also working in Blender 2.91.2.

Can anyone confirm that they still have this issue? I got no clear replies so far. I can't reproduce any issue with my RX 5700 XT card with latest master.

If you do still have an issue, please provide complete information so we can determine what exactly is different and try to reproduce that setup. In particular:

  • Graphics card name
  • Driver version
  • .blend file scene used for testing
  • Test with the latest Blender 2.93 from https://builder.blender.org/download/ and mention the exact git hash
  • Test with Blender 2.92
  • If using multiple GPUs, test if using just one works

I have not been able to reproduce the issue with empty scenes mentioned by @I did must have done it (leonardos) so far.
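
If it helps with collecting the requested details, here is a small sketch (assuming the 2.9x bpy API) that prints the Blender build and the detected Cycles compute devices so they can be pasted into a reply.

```python
# Print the Blender build info and the detected Cycles compute devices.
import bpy

print("Blender:", bpy.app.version_string, "hash:", bpy.app.build_hash.decode())

prefs = bpy.context.preferences.addons['cycles'].preferences
prefs.get_devices()  # refresh the detected device list
print("Compute device type:", prefs.compute_device_type)
for device in prefs.devices:
    print(f"  {device.type}: {device.name} (use: {device.use})")
```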

Brecht Van Lommel (brecht) changed the task status from Needs Triage to Needs Information from User. Apr 12 2021, 4:00 PM
Brecht Van Lommel (brecht) changed the subtype of this task from "Report" to "Bug".

I was able to replicate the problem the first time today with the Default Cube.

Split kernel error: failed to load kernel_holdout_emission_blurring_pathtermination_ao

But after the kernels compiled again when reloading the file, the problem disappeared.

I was also able to replicate it in this simple file:

Split kernel error: failed to load kernel_do_volume

But the second time I tried it, the kernels compiled; however, it takes forever to start previewing in the viewport.
It works well when rendering via F12.


Operating system: Windows-10-10.0.19041-SP0 64 Bits
Graphics card: Radeon (TM) RX 480 Graphics ATI Technologies Inc. 4.5.14760 Core Profile Context 20.45.37.01 27.20.14537.1001
Blender Version: 2.93.0 Alpha, branch: master, commit date: 2021-04-11 20:10, hash: rB09d7d88cc42a

The problem is still present for me in both 2.92 and 2.93 Alpha 09d7d88cc42a

When I try to do a viewport render with GPU Compute enabled, it takes a while before throwing a "Split kernel error: failed to load kernel_indirect_background" error.

Rendering via F12 seems to work fine in both versions

Graphics card: Radeon RX 470
Driver Version: 20.50.03.05-210326a-365573E-RadeonSoftwareAdrenalin2020

The issue still occurs. Tested with Blender 2.93 hash 09d7d88cc42a and Blender 2.92. Here is a screen capture in case it's helpful. I have two GPUs, and for the purpose of this test I disabled one of them in the preferences.
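
For anyone else with multiple GPUs who wants to run the same single-GPU test, here is a rough scripted equivalent of disabling all but one device; the property names are assumed from the 2.9x bpy API.

```python
# Enable only the first detected OpenCL device and disable the rest.
import bpy

prefs = bpy.context.preferences.addons['cycles'].preferences
prefs.get_devices()  # refresh the detected device list
opencl_devices = [d for d in prefs.devices if d.type == 'OPENCL']
for index, device in enumerate(opencl_devices):
    device.use = (index == 0)
    print(device.name, "enabled" if device.use else "disabled")
```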

  • Graphics card name: Radeon RX 580 Series
  • Driver version: 20.50.03.01-210310a-365275C-RadeonSoftwareAdrenalin2020 (21.3.1)
  • .blend file scene used for testing:

Lingfox (Lingfox) added a comment. Edited Apr 13 2021, 5:04 PM

I still have this problem, if that helps.
(Sapphire Pulse) RX 5700 XT (with the driver from Adrenalin 21.3.2); Blender 2.92; default scene. The viewport just won't work, although rendering will.
EDIT: I still have this issue with Adrenalin 21.4.

I have the same problem with a Radeon (TM) R7 M340, using Adrenalin 2020.

It began in mid-December and was then fixed for a while in mid-January, but with newer releases of Blender it came back. I cannot use the real-time GPU render, nor render any image with it, due to a split kernel error. Today I did a clean reinstall of all AMD drivers, but it did not change a thing.

This may have been worked around in rB3e472d87a8d1: Cycles OpenCL: disable AO preview kernels, which solves an issue that could happen specifically in the viewport.

After Brecht's comment: just checking, is this still an issue?

@Philipp Oeser (lichtwerk), check T88521#1165792 (the author of that ticket said the problem is fixed in 3.0.0).

Lingfox (Lingfox) added a comment. Edited Jun 5 2021, 1:00 PM

Not for me, on Blender 2.93.0 (release).
But then again, the issue was also fixed back in 2.92.0 and then came back from 2.92.1 onward.

Brecht Van Lommel (brecht) closed this task as Resolved. Jun 7 2021, 2:02 PM
Brecht Van Lommel (brecht) claimed this task.

I'll assume it's solved then, if it comes back we can reopen the report.