Optibeef - TFP follow-on funding

Project Details


The focus of the proposed project is to refine several aspects of the recently completed Optibeef project in order to achieve a commercially viable, fully automated product ready for deployment.

The primary aim of Optibeef was to develop an enhanced decision support system, deployed both on-farm and in abattoirs, integrating on-farm whole-life performance monitoring with detailed carcass measurements from precision agriculture technology. Optibeef successfully accomplished its initial aims, including the development of innovative parameter-extraction algorithms for both carcass and live-animal images, highly accurate grading prediction models, and accessible on-farm and abattoir data platforms.

However, further work is needed to achieve commercial viability. In particular, real-time image processing is needed to maximise utility to abattoirs, and UKIDs must be reliably linked with these images for on-farm decision support. To achieve commercial licensing, the accuracy of the existing grading models must be improved and extended to a 15-point scale, and a method for reliably assessing fat coverage must be devised.

Our key objectives are as follows:
1. Develop a fully automated software integration solution that reliably links images with UKID and kill number. This will be achieved either by exposing abattoir UKID software to Innovent software via an API, or by developing a scan-on prototype that tracks UKID in parallel with the abattoir software.
2. Develop an algorithm that reliably predicts carcass fat coverage. This will include the installation of new RGB camera technology in the abattoir, and the exploration of fat-depth solutions using both new imaging technology and novel methods applied to existing camera technologies.
3. Refine the image processing algorithm by incorporating real-time, abattoir-independent parameter extraction; steam detection and mitigation algorithms; and an alert system to notify the abattoir of unfavourable environmental conditions that may interfere with image capture.
4. Develop a commercial-grade algorithm that predicts saleable meat yield from carcass images and operates on a 15-point grading scale. This will include the exploration of novel machine-learning alternatives for grading.
5. By the end of the project, have a robust abattoir imaging product ready to be licensed by Defra (this will be one of only two licensed VIA options for UK abattoirs). It will link with the recently developed on-farm technology and provide the beef industry with an invaluable integrated management platform. It will facilitate informed decision-making so that more animals meet the abattoir's prescribed requirements and producers better manage their inputs, reducing the impact on the environment whilst increasing the return on animals reared.
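For context on objective 4, the 15-point scale follows the EUROP grid, in which each of the five conformation classes (E, U, R, O, P) is subdivided into low ("-"), mid ("="), and high ("+") bands, giving 15 grades. The sketch below illustrates this expansion; the numeric scores 1-15 are an illustrative assumption, not the project's actual model encoding.

```python
# Sketch: expanding the 5 EUROP conformation classes into the 15-point scale.
# Class letters and subdivisions follow the EUROP grid; the 1..15 scores are
# an illustrative assumption, not the Optibeef model's actual encoding.

EUROP_CLASSES = ["P", "O", "R", "U", "E"]   # poorest to best conformation
SUBDIVISIONS = ["-", "=", "+"]              # low, mid, high within each class

def fifteen_point_scale():
    """Return the 15 grades in ascending order, mapped to scores 1..15."""
    grades = [c + s for c in EUROP_CLASSES for s in SUBDIVISIONS]
    return {grade: score for score, grade in enumerate(grades, start=1)}

scale = fifteen_point_scale()   # {"P-": 1, "P=": 2, ..., "E+": 15}
```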

The beef industry’s need for an integrated platform providing decision-making support to meet abattoir requirements is ongoing, and the system in development is the most advanced and innovative solution to meeting these needs.


In this project SRUC’s role is entirely desk-based. We will:
• Analyse data (using machine learning) captured throughout Opti-Beef to assess predictions of EUROP classifications (conformation/fat), Cold Carcass Weight, and Yield (total saleable meat yield and yield/dimensions of individual primals).
• Provide scientific/technical advice to support optimal system design/specification.
• Collate and analyse new data from (i) a new RGB camera focussed on measurement of fat depth AND fat cover; (ii) new 3D camera data linked to grading data captured on the 15-point scale.
• Support knowledge exchange activities: attendance at on-farm events, mixed media outputs (podcasts, webinars, social media), and scientific outputs (conferences/scientific publications).
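As an illustration of the kind of analysis described in the first bullet, the sketch below fits a linear baseline that predicts Cold Carcass Weight from image-derived carcass parameters. The feature names and data are synthetic assumptions for illustration only, not Opti-Beef data or the project's actual models.

```python
import numpy as np

# Sketch with synthetic data: a linear baseline predicting Cold Carcass
# Weight (kg) from hypothetical image-derived carcass parameters.
rng = np.random.default_rng(0)
n = 200

# Assumed predictors: carcass length, width, projected area (arbitrary units).
X = rng.uniform([180.0, 40.0, 0.8], [260.0, 70.0, 1.6], size=(n, 3))
true_w = np.array([0.9, 1.5, 60.0])
y = X @ true_w + 20.0 + rng.normal(0.0, 2.0, n)   # CCW with intercept + noise

A = np.hstack([X, np.ones((n, 1))])               # add intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)      # ordinary least squares

pred = A @ coef
r2 = 1.0 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
```

A real analysis would replace the least-squares baseline with the machine-learning models named above and validate against held-out carcasses; the baseline simply shows the data shape involved.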

Effective start/end date: 1/02/24 to 31/01/25

