This might be a bit of old news, but the other day I saw that Gitlab published a “DevSecOps landscape” survey. So I decided to take a look.
First, the usual caveat. Our industry is more diverse than you think, so any survey like this from a vendor has biases and skews. For example, 29% of the respondents said they contributed to Gitlab (p.6), but I’m pretty sure nowhere near 29% of the entire software development industry has contributed to Gitlab. So you know what I’m talking about.
Too much AI/ML to be true
The report talks a lot about the prevalence of AI/ML. I have a feeling that the report authors aren’t distinguishing unintelligent bots that you interact with in pull requests, such as Renovate, from actual AI/ML based solutions like GitHub Copilot (p.7). Then, some other parts of the report are hard to believe, such as the claim that 16% of respondents use “AI/ML to review code before a human sees it”. I mean, is AWS CodeGuru that popular?
But discounting all those, one still cannot help but feel that AI/ML is getting adopted more and more.
- 41% of respondents report using AI/ML for testing, up 25 percentage points from last year (p.7)
- 30% of developers in the survey named “an understanding of AI/ML” as “the most important skill for their future careers”, up from 22% last year (p.15)
- The same answer scores pretty highly for security experts and ops (p.19, p.23)
I’m in the business of bringing ML to tests, so naturally I also feel this shift is inevitable. We are getting disrupted by “big data” just like everyone else.
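To make that concrete, here’s a toy sketch of what “ML for testing” can look like in its simplest form. This is entirely hypothetical illustration on my part, not any particular vendor’s algorithm: rank tests by how often they failed historically when the same files changed, and run the riskiest ones first.

```python
# Toy predictive test selection (hypothetical illustration, not any
# vendor's real algorithm): score each test by its historical failure
# rate when the currently-changed files were touched.
from collections import defaultdict

# Made-up historical records: (changed_file, test_name, did_it_fail)
history = [
    ("parser.py", "test_parser", True),
    ("parser.py", "test_parser", False),
    ("parser.py", "test_cli", False),
    ("render.py", "test_render", True),
]

def rank_tests(changed_files, history):
    """Order tests by failure ratio for runs that touched these files."""
    fails = defaultdict(int)
    runs = defaultdict(int)
    for f, test, failed in history:
        if f in changed_files:
            runs[test] += 1
            fails[test] += int(failed)
    # Highest historical failure ratio first.
    return sorted(runs, key=lambda t: fails[t] / runs[t], reverse=True)

# A change to parser.py puts test_parser (1 failure in 2 runs) ahead of
# test_cli (0 failures in 1 run).
print(rank_tests({"parser.py"}, history))  # → ['test_parser', 'test_cli']
```

Real systems train on far richer signals than a single failure ratio, but even this crude version shows why test history is valuable “big data”.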
Gitlab’s market position
Gitlab users appear to be concentrated in Asia, followed by Europe and Russia (p.4). This makes sense to me, because in that part of the market, labor costs are lower, so GitHub is relatively more expensive. I’m willing to bet that “Europe and Russia” here is concentrated in lower-cost countries in the south and east. That speaks to the power of a free product!
As a side note, I’d love to hear any success story of a service company charging different prices for different regions of the world. Whoever figures that out will make a lot of money, I can tell you.
Also, I couldn’t help but chuckle that even among this Gitlab audience, 21% of the respondents use Jenkins for CI/builds. Go Jenkins! (p.7) IMHO, a flexible general-purpose tool always wins out in the end. I mean, think about all the things you use spreadsheets for…!
So while Gitlab might be trying to become an all-encompassing one-stop shop, the reality appears to be that it’s still primarily used as a source code hosting platform. Nothing wrong with that.
Oh, BTW, 14% said they use GitHub Actions with Gitlab. What!? Why!?
The survey asks questions about a “DevOps platform”, whatever that means (p.8). I have a feeling that different people at different maturity levels will picture very different things when they hear “DevOps platform”.
There’s a definite, decade-long trend that more of the tools and services that we use are getting integrated, managed, operated, and improved by one team and consumed by other teams. I’ve always believed that different companies producing different software at different maturity levels need different solutions, and thus I believed in the benefit of assembling the tools/services best for you. As a creator of Jenkins and an operator of Launchable, I still believe very much in tools/services that do one thing and interface with others well.
Gitlab being primarily used as the source code repo also fits nicely into this narrative.
Testing is the problem
Anecdotal quotes from the survey (p.6):
For the third year in a row, a majority of survey takers resoundingly pointed to testing as the area most likely to cause delays.
Testing can be both slow in writing and running
business tests take time to be complete (2-4 days on average)
There are more great anecdotes of similar nature in p.11, p.12, & p.14.
To get to continuous delivery, we need to assure quality, so we have automated tests built-in. Investing in these areas allowed our team to deploy 2000 times to production over a year, where in the past we would deploy maybe 6 times.
We automate everything possible, to be able to test our product ‘like in real life’ without any downside. This increases confidence and simplifies tests for everything.
Integration testing has been a big plus in how confident we are to release automatically and deliver a version. We are now able to deliver any day
The key speed limiter is testing, or more broadly speaking, the exercise of building confidence in the changes you just produced. This is encouraging news to me as I’m tackling this same challenge.
There are more automated tests now, and the number is rapidly increasing (p.7), but people still point to “more testing of all types” as the key missing link (p.9).
The other pillar of higher confidence is code review. This practice is still growing, and so is the pain (p.14).
Security, but …
So people talk about DevSecOps, and the report says more people are integrating security into the delivery pipeline. 70% said they shifted security left (p.18).
But the report also shows that what’s actually done is limited. Basically, it amounts to container & dependency scans, and perhaps license compliance checks. IOW, static analysis that can easily fit into the delivery pipeline.
One phrase to summarize the report is “more of the same”. Well-understood best practices, such as CI, CD, code review, test automation, security scanning, … are being taken up by the mainstream, done by a lot more people. I’m seeing the tide rising slowly and steadily, moving a tremendous mass of water.
I’m not really seeing any real new change, except this inexplicable insistence on AI/ML in the development process. It could just be that the survey designers did a poor job of coming up with questions, but I’ll certainly be watching out for more news in this space going forward.