Extracting Pasture Phenotype and Biomass Percentages using Weakly Supervised Multi-target Deep Learning on a Small Dataset. (arXiv:2101.03198v1 [cs.CV])

The dairy industry uses clover and grass as fodder for cows. Accurate
estimation of grass and clover biomass yield enables smart decisions in
optimizing fertilization and seeding density, resulting in increased
productivity and positive environmental impact. Grass and clover are usually
planted together, since clover is a nitrogen-fixing plant that enriches the
soil. Maintaining the right proportion of clover and grass in a field reduces
the need for external fertilization. Existing approaches for estimating the
grass-clover composition of a field are expensive and time-consuming: random
samples of the pasture are clipped, and then the components are
physically separated and weighed to calculate the percentages of dry grass,
clover, and weeds in each sample. There is growing interest in developing
novel deep-learning-based approaches to non-destructively extract pasture
phenotype
indicators and biomass yield predictions of different plant species from
agricultural imagery collected from the field. Providing these indicators and
predictions from images alone remains a significant challenge. Heavy occlusions
in the dense mixture of grass, clover and weeds make it difficult to estimate
each component accurately. Moreover, although supervised deep learning models
perform well with large datasets, it is tedious to acquire large and diverse
collections of field images with precise ground truth for different biomass
yields. In this paper, we demonstrate that applying data augmentation and
transfer learning is effective in predicting multi-target biomass percentages
of different plant species, even with a small training dataset. The scheme
proposed in this paper uses a training set of only 261 images and predicts
the biomass percentages of grass, clover, white clover, red clover, and weeds
with mean absolute errors of 6.77%, 6.92%, 6.21%, 6.89%, and 4.80%,
respectively.

Source: https://arxiv.org/abs/2101.03198
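
The abstract does not describe the network in detail, so the sketch below is
only one plausible way to combine data augmentation and transfer learning for
multi-target biomass-percentage regression; it is not the authors'
implementation. The ResNet-18 backbone, the augmentation parameters, and the
learning rate are all assumptions.

```python
# Illustrative sketch only -- not the paper's implementation. Assumes a
# ResNet-18 backbone pretrained on ImageNet (torchvision >= 0.13) and five
# independent regression targets; every hyperparameter here is hypothetical.
import torch
import torch.nn as nn
from torchvision import models, transforms

NUM_TARGETS = 5  # grass, clover, white clover, red clover, weeds

# Data augmentation: with only ~261 training images, aggressive geometric
# and photometric augmentation is one way to reduce overfitting.
train_transform = transforms.Compose([
    transforms.RandomResizedCrop(224, scale=(0.7, 1.0)),
    transforms.RandomHorizontalFlip(),
    transforms.RandomVerticalFlip(),
    transforms.ColorJitter(brightness=0.2, contrast=0.2, saturation=0.2),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Transfer learning: start from ImageNet weights and replace the 1000-way
# classifier with a multi-target regression head.
model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
model.fc = nn.Sequential(
    nn.Linear(model.fc.in_features, NUM_TARGETS),
    nn.Sigmoid(),  # each predicted biomass fraction lies in [0, 1]
)

# L1 loss directly optimizes the metric reported in the abstract (MAE).
criterion = nn.L1Loss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# Sanity check: one forward pass on a dummy batch.
preds = model(torch.randn(2, 3, 224, 224))
print(preds.shape)  # torch.Size([2, 5]), one fraction per target
```

Using L1 loss matches the mean-absolute-error metric reported in the
abstract, and a sigmoid output keeps each fraction in a valid range without
forcing the five targets to sum to one (clover overlaps with its white and
red subclasses). Fine-tuning all layers, rather than freezing the backbone,
is a common choice when the target imagery differs strongly from ImageNet.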
