Comparison of Automated Code Review Tools: Codebeat, Codacy, CodeClimate and Scrutinizer

Łukasz Ozimek

Updated Jan 25, 2023 • 21 min read

Code review makes your code stronger, reduces the risk of overlooking major bugs and supports the culture of feedback.

The process constitutes an inherent part of best practices in web development, so skipping it in your projects might be a big mistake. Luckily, it can be easily and effectively conducted thanks to numerous automated code review tools that are available on the market. Don’t get overwhelmed by the number of providers! With our in-depth analysis, you’ll find a solution to fit your needs.



Code Review at Netguru

At Netguru, code review is a crucial element of product development, and together with the Test-Driven Development (TDD) process it lets us follow the best practices available for Rails, CSS and JavaScript code. It helps us make projects better, faster and more secure by making the code bulletproof. It is also an opportunity for people, especially newcomers, to quickly learn what good code looks like.

Our developers used to check the correctness of our code with CodeClimate. However, as our needs and expectations grew, we decided to look for an alternative with more advanced features. Our R&D team conducted in-depth research, and that’s how we found Codebeat, a tool that feels almost tailor-made for our business. We are aware, though, that your dev team might require different features from a code review provider. That’s why we are sharing the results of our research, so that you can pick the solution that is right for you and save time on doing your own.

Codebeat

Codebeat is a dynamically growing tool that covers the major technologies and programming languages. It has evolved substantially over the last few months, and it’s worth noting that the team is open to feedback and to implementing new features suggested by users. However, there are still some things missing. The tool does not perform any security checks (perhaps a “simple” integration with Brakeman would do the job). It also does not support any open-source tools or linters (in fact, we are still using Hound for linting) and does not analyse CSS/SCSS.
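
Until such an integration exists, the security gap is easy to cover on the project side. A hedged sketch, assuming a Rails project that runs the open-source Brakeman scanner as a separate CI step (this is standard Brakeman usage, not a Codebeat feature):

```ruby
# Gemfile - pull in Brakeman, a static security scanner for Rails
group :development, :test do
  gem "brakeman", require: false
end

# Then run it as a separate step in CI, e.g.:
#   bundle exec brakeman -o brakeman-report.html
# and fail the build on warnings if that fits your workflow.
```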

Pros:

  • support for most of the languages we use (Ruby, JS, Swift, Objective-C),
  • metrics customisation,
  • measuring tools with their own algorithms, nicely described in the docs (not just a bunch of open-source projects combined together),
  • very good support from their team,
  • a small but well-documented API, which facilitates management, e.g. granting users access to selected projects (via teams),
  • unique quick wins tab,
  • a `codeclimate-test-reporter` gem that integrates Codebeat with SimpleCov coverage reports (see the sketch after this list),
  • customer suggestions considered and implemented in the product,
  • a dynamically growing tool,
  • support for Kotlin and Elixir (beta).
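
On the coverage point above: the setup on the project side is the usual SimpleCov configuration, and the reporter gem then picks up and submits the generated results. A minimal sketch, assuming RSpec (the filter is just an example):

```ruby
# spec/spec_helper.rb - SimpleCov has to start before the app code is loaded
require "simplecov"

SimpleCov.start "rails" do
  add_filter "/spec/"  # don't count the specs themselves as covered code
end
```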

Cons:

  • still missing some things in the documentation (e.g. no explanation of how code duplication detection works),
  • no checks for possible security issues,
  • no CSS/SCSS analysis.

Possible issues:

  • I don’t see an option to list all the issues found (with search by category); there’s only the option to check every single file or the “Quick Wins” tab,
  • performance is sometimes surprisingly poor.

Codacy

Codacy has an awesome UI, lots of features, and great flexibility thanks to dozens of options. There is also a tool (in beta) which allows you to define your own patterns and have them checked automatically. Frankly, it wasn’t easy to find disadvantages of this tool or any complaints from its clients. Still, there is some room for improvement and potential for Codacy to grow.

Pros:

  • used by big players like PayPal or Adobe,
  • great and intuitive UI,
  • the possibility to define issue-based goals to improve the codebase,
  • checking lots of security issues (like assigning strange values to private APIs, which may lead to unexpected app behaviour; see the example after this list),
  • a nice feature of browsing commits and monitoring related issues,
  • Dockerfile analysis,
  • huge flexibility thanks to disabling/enabling patterns or whole packages and ignoring certain patterns in the selected files,
  • a time-to-fix estimate for each issue,
  • small company growing fast, delivering fresh features frequently,
  • well-described issues with examples right below each case (no need to browse the documentation to find out why the issue occurred).
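
To make the security point above concrete, here is a hypothetical Rails snippet of the kind that such checks typically flag (the controller and query are made up for illustration):

```ruby
class SearchController < ApplicationController
  def index
    # Interpolating user input straight into SQL is the classic pattern
    # that static security analysis reports as an injection risk...
    @users = User.where("name = '#{params[:name]}'")

    # ...while the parameterised form passes the check:
    # @users = User.where(name: params[:name])
  end
end
```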

Cons:

  • documentation incomplete in parts (some images are hard to read and the amount of information is sometimes insufficient),
  • unintuitive one-page charts to track code quality changes over time (it’s a new feature, maybe not fully implemented yet),
  • no `hotspots` or `quick wins`,
  • no issue search, only a few drop-down filters.

Possible issues:

  • we don’t know how the support works there,
  • we also don’t know how flexible they are in terms of implementing clients’ suggestions into their product.

CodeClimate

CodeClimate is a well-developed and very stable solution with a great number of features. It has many advantages over its competitors and many big players recommend it as the best option. However, it lacked some crucial functionalities that we required for the Netguru code review process, so we switched to an alternative solution.

Pros:

  • a great number of supported languages, technologies and frameworks,
  • used by the biggest players, including Pivotal and New Relic on the enterprise side, and Rails and jQuery in open source,
  • very stable,
  • nice new UI,
  • a well-maintained test coverage gem,
  • browser extensions,
  • trends charts,
  • test coverage out of the box,
  • hotspots - a `quick wins` list.

Cons:

  • seemingly just a bunch of open-source projects integrated together,
  • pricing - it seems to be the most expensive tool in this comparison,
  • still unpredictable API (in beta),
  • no support for Objective-C,
  • the total issue count is not broken down by issue type,
  • no detailed description of the issue, only a header with source code,
  • no issue searching or filtering, just a paginated list of all of them.

Possible issues:

  • seemingly no interest from the CodeClimate team in extending the tool the way a customer may suggest.

Scrutinizer

Scrutinizer doesn’t stack up well against the other solutions. In fact, it seems to lack basic attributes that are necessary for code review. Analysing the code with a different Ruby version than the one defined in the repo is what actually disqualifies it: it shows `tests failed` even when the tests passed on CircleCI and locally. You cannot reliably test a repository if it uses a Ruby version other than the ones available through the dashboard. It also reports lots of issues and errors with the “parser could not interpret the code” message. Really strange for a tool you’re supposed to pay for.
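
For context, the Ruby version a project expects is normally declared in the repository itself, and that declaration is what Scrutinizer appeared to ignore. A minimal sketch of such a pin in the Gemfile (the version number is just an example):

```ruby
# Gemfile
source "https://rubygems.org"

ruby "2.3.0"  # Bundler refuses to run if the interpreter doesn't match this version
```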

Pros:

  • seems to have a very good API,
  • it’s actually one of the cheapest solutions if we ignore the performance,
  • automatically detects code changes in the dashboard,
  • well-written documentation,
  • the ability to filter issues by user,
  • dedicated site with the current status of the services,
  • extended team management, possibility to create organisations.

Cons:

  • Ruby available only in predefined versions (e.g. no option to run the tests on 2.3.0),
  • dead test coverage gem (only 2 commits 3 years ago),
  • API available only when subscribed to the most expensive package,
  • puzzling warnings on regexps (e.g. treating `/` as division instead of the beginning of a regexp when there is no space after it; see the example after this list),
  • not a predictable tool.
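
The regexp complaint comes from a genuine ambiguity in Ruby’s grammar, which MRI itself resolves with a warning; here is a guess, with made-up variables, at the kind of code that seems to trip Scrutinizer’s parser:

```ruby
lines = ["admin: alice", "user: bob"]
total = 10
count = 2

# With no space after it, a bare `/` can start a regexp literal or mean
# division. MRI parses the argument below as a regexp but prints an
# ambiguity warning; a stricter third-party parser may read the `/` as
# division instead and fail with a parse error.
admins = lines.grep /admin/

ratio = total / count          # unambiguous division

admins = lines.grep(/admin/)   # parentheses remove the ambiguity
```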

Possible issues:

  • performance: you pay per container, and a container is simply one task that can run at a time, so fewer containers mean lower throughput and a longer wait for your code to be analysed,
  • it’s possible to configure the checks (Settings -> Configurations -> Checks), but there’s no way to restore the default configuration or even set a default for an entire organisation,
  • it doesn’t display class names properly (the namespace/module should be included), which makes classes harder to find,
  • files and issues are paginated in a way that forces you to click `next` every time you want to move to the next batch of, say, 10 classes,
  • there’s no option to see all the rules (checks): there are two groups, enabled and disabled, but no “all” view.

Code Review Tools Comparison Summary

Supported Languages and Technologies:

CodeClimate: Ruby/Rails, JavaScript, Python, PHP, Swift, SCSS/CSS, Go, CoffeeScript, Apex, Ember, ESLint, Haskell, Haxe, RubyMotion, Vim Script;
Codebeat: Ruby, JavaScript, Python, Java, Swift, Go, TypeScript, Objective-C, Kotlin, Elixir;
Codacy: Ruby, JavaScript, Python, PHP, Java, Swift, CSS, TypeScript, CoffeeScript, Scala, C/C++, Dockerfile, Sass, Shell Script;
Scrutinizer: Ruby/Rails, JavaScript, Python, PHP.

Measuring Tools:

CodeClimate: many existing open-source tools like RuboCop, Brakeman, CSS/SCSS Lint, ESLint, Flog, etc. (the full list is in their docs);
Codebeat: their own algorithms and implementations written from scratch (described in their docs);
Codacy: many existing open-source tools like RuboCop, Brakeman, CSS/SCSS Lint, ESLint, Flog, etc. (the full list is in their docs);
Scrutinizer: a closed-source codebase with the possibility to use open-source tools (the supported tools are listed in their docs).

Pricing:

CodeClimate: $16.67/user/month when billed yearly, otherwise $20;
Codebeat: $20/user/month, but you can probably negotiate;
Codacy: $15/user/month;
Scrutinizer: €200/month for an unlimited number of users; the plan includes 2 containers (1 container means just 1 task at a time), and each extra container costs €50.
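
To put those rates side by side for, say, a ten-person team: CodeClimate works out to roughly $167 per month on annual billing ($200 month-to-month), Codebeat to $200, and Codacy to $150, while Scrutinizer stays at a flat €200 regardless of team size, so its flat fee only starts to pay off for larger teams (and extra containers cost more on top).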

Documentation:

CodeClimate: very good and comprehensive;
Codebeat: still some things missing;
Codacy: not bad, although there is not much text and some images are not clickable, so readability is limited;
Scrutinizer: good.

API:

CodeClimate: yes, still in beta version;
Codebeat: yes, simple but usable;
Codacy: yes, not described perfectly in their docs;
Scrutinizer: seems to be really good.

Other features:

Coverage report: CodeClimate yes; Codebeat yes; Codacy yes; Scrutinizer yes.
Security analysis: CodeClimate yes; Codebeat no; Codacy yes; Scrutinizer yes.
Team per project: CodeClimate yes; Codebeat yes; Codacy yes; Scrutinizer yes.
GitHub PR integration: CodeClimate yes; Codebeat yes; Codacy yes; Scrutinizer yes.
Slack integration: CodeClimate yes; Codebeat yes; Codacy yes; Scrutinizer yes.
Jira integration: CodeClimate yes; Codebeat no; Codacy yes; Scrutinizer no.
Total number of integrations: CodeClimate 13; Codebeat 6; Codacy 10; Scrutinizer 4.

Wrap-up

After a deeper look at all the tools described above, I can disqualify Scrutinizer: it doesn’t support all Ruby versions and has many issues that make reliable metrics difficult, and sometimes outright impossible, to obtain. On the other hand, Codebeat holds up well against CodeClimate and Codacy. For these three solutions the pros seem to outweigh the cons, and the final choice may come down to a particular feature that matters for your individual needs, just as it did for Netguru.

