

AI-generated code introduces a number of risks into the development process. A recent Sonatype report found that AI hallucinated 27% of upgrade recommendations for open source projects, while research from Veracode found that AI introduced security vulnerabilities in 45% of 80 coding tasks across 100+ different LLMs. Now, new research from Black Duck is shedding light on another pressing issue related to AI-generated code: IP and licensing risks.
In the company’s 2026 Open Source Security and Risk Analysis (OSSRA) report, it analyzed 947 commercial codebases and found that two-thirds of them had license conflicts, the highest percentage in the history of the report. This represents a 12% increase from last year, which also sets a record for the largest jump in the report’s history.
One of the codebases that Black Duck audited contained 2,675 distinct licensing conflicts, underscoring how complex managing IP has become.
“This rise is partly driven by ‘license laundering,’ where AI assistants generate code snippets derived from copyleft sources (like GPL) without retaining the original license information,” the company explained in a blog post. For example, the report reveals that 17% of open source components are entering codebases outside of traditional package managers, through copy-and-pasted snippets, direct vendor inclusions, or AI generation. This presents a challenge, as code that enters this way may be invisible to traditional manifest-based scanning tools.
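To see why manifest-based scanning has this blind spot, consider the following minimal sketch (a hypothetical illustration, not Black Duck’s tooling). A manifest scan only inspects declared dependencies, so a pasted snippet living directly in a source file is never examined; and if an AI assistant stripped the snippet’s original license header, even a content-level heuristic has nothing to match.

```python
import json
import re

# Manifest-based scanning only sees dependencies that were declared.
manifest = json.loads('{"dependencies": {"left-pad": "1.3.0"}}')
declared = set(manifest["dependencies"])

# A pasted snippet (e.g., from an AI assistant) lives in a source file,
# not in the manifest, so a manifest scan never inspects it.
source_file = """
# Snippet adapted from a GPL-2.0 project; attribution preserved here.
def normalize(xs):
    total = sum(xs)
    return [x / total for x in xs]
"""

# A content-level scan can flag copyleft markers in surviving headers,
# but if "license laundering" dropped the header entirely, this
# heuristic would find nothing either.
COPYLEFT = re.compile(r"\b(GPL|LGPL|AGPL)\b")

print("declared deps:", sorted(declared))      # only left-pad
print("snippet in manifest:", "normalize" in declared)  # False
print("content scan flags snippet:", bool(COPYLEFT.search(source_file)))
```

This is why the report treats snippet-level and AI-generated inclusions as a distinct detection problem: the component exists in the codebase but leaves no trace in any dependency manifest.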
This year’s OSSRA report also found that the mean number of vulnerabilities in code has nearly doubled since last year. Eighty-seven percent of the codebases had at least one vulnerability, 78% had high-risk vulnerabilities, and 44% had critical-risk vulnerabilities.
The company explained that it discovered a “zombie component” problem when digging into the research. Ninety-three percent of codebases contained components that hadn’t seen active development in two years, 92% contained components that were at least four years old, and only 7% of components in use were upgraded to the latest version.
“These abandoned components are a ticking time bomb. When a vulnerability is discovered in a project that hasn’t been touched in years, there’s often no maintainer left to fix it. Organizations are left with difficult choices: fork the project, refactor the application, or accept the risk,” the researchers wrote.
Black Duck concluded that a key takeaway from this year’s report is that there is a growing gap between AI adoption and governance.
“As regulatory pressure mounts from frameworks such as the EU AI Act and Cyber Resilience Act, the ‘ship and forget’ model of software delivery is no longer viable. Organizations must move toward a model of continuous supply chain transparency, where every component, whether human-written, AI-generated, or open source, is accounted for,” Black Duck said.
