Sustainable meat production and healthy eating are becoming increasingly important, and the global pandemic and supply-chain disruptions have further exposed the limitations of traditional livestock farming. Many plant-based meat companies focus on ground-meat alternatives because the texture is uniform, costs are lower, and replicating a fully marbled cut of steak remains a major challenge. While technology is advancing toward realistic plant-based steaks, there is currently no ground-truth dataset or standardized equipment for quantifying the texture of animal cuts of steak; building one is a critical step in closing this gap.
Goal: To develop a low-cost, high-fidelity meat scanning platform for 3D texture analysis of food products.
Purpose: Internal data collection at Redefine Meat to improve texture replication in plant-based meats.
The uncanny valley is a concept introduced by Japanese roboticist Masahiro Mori to describe the discomfort people feel when robots closely resemble humans, but not perfectly. I believe this concept applies to alternative meats as well.
As plant-based companies get closer to replicating traditional cuts of meat, there’s a point where the resemblance is close, but not close enough—creating a feeling of revulsion or skepticism. However, once the product reaches a certain level of perfection in texture and appearance, the discomfort disappears, making it indistinguishable from real meat.
By developing a high-fidelity scanning platform, we can capture and quantify the texture of real steak cuts, helping to bridge this "valley" and make alternative meats more visually and texturally realistic.
To create a usable dataset, I needed to develop:
1. The scanning machine itself: the hardware, onboard electronics, and motion system
2. The measurement tools and software needed to operate it
3. A processing pipeline that turns raw measurements into high-resolution texture maps
Step 1 had a clearer roadmap, since most of the hardware and the core processor were based on our custom food printer. Steps 2 and 3 required months of iteration before large-scale testing.
I designed all of the onboard electronics for the system and housed them in a modular control box made from aluminum standoffs and laser-cut acrylic panels. The system needed to be easily assembled and disassembled for transportation — everything had to break down neatly to fit inside an oversized Pelican case for overseas shipping.
One of the more interesting challenges I encountered was finding a way to display LED signals from the microcontroller, which was mounted 3 inches below the top surface of the control box. I discovered that, with careful alignment, light could travel cleanly through a short acrylic rod positioned just above the LEDs, similar to how fiber optic cables function. Building on that idea, I created a set of thin, square acrylic rods that press-fit into the top panel and channel the light from the board below. While not as refined as true fiber optics, they transmit visual signals to the surface in a simple, low-cost way, perfect for quick status indication in a compact enclosure. It turned out to be an elegant solution, and one I was really proud of!
Given the machine’s multifunctional nature — and the fact that it would need to be reassembled and operated without my in-person instruction — I created a tutorial video to walk users through its core functions, alongside a custom graphical user interface (GUI) I designed for streamlined operation.
The GUI was developed in MATLAB and significantly simplified the data collection and processing workflow. I used it extensively in my own experiments and also had team members (with no prior experience using the machine) operate it as a way to test usability. Their feedback helped me iterate on the interface and improve the overall user experience.
Key features of the GUI included customizable probe motion paths, automated image-capture sequences, live load-cell readouts, and built-in data processing and export.
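For a flavor of how such a control panel can be assembled, here is a minimal sketch using MATLAB's uifigure tools; the window title, button labels, and stub callback are hypothetical, not the actual interface:

```matlab
% Minimal sketch of a control panel built with MATLAB's uifigure tools.
% The layout, labels, and stub callback are hypothetical.
fig = uifigure('Name', 'Meat Scanner Control');
g   = uigridlayout(fig, [3 1]);

scanBtn = uibutton(g, 'Text', 'Run Scan Path');
imgBtn  = uibutton(g, 'Text', 'Capture Image Sequence');
procBtn = uibutton(g, 'Text', 'Process + Export Maps');

% Stub standing in for the real motion routine.
runScan = @() disp('scan started');
scanBtn.ButtonPushedFcn = @(src, evt) runScan();
```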
Aside from capturing texture data, the machine also needed to shoot overlapping images that could later be stitched together into a single high-resolution image of each cut of meat. This setup allowed us to take a sequence of images, then apply filtering and thresholding techniques to generate a marbling map that distinguishes fat from muscle.
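As an illustration, a minimal MATLAB sketch of the thresholding step might look like the following; the input filename, filter size, and sensitivity are placeholder values, not the exact parameters we used:

```matlab
% Minimal sketch of the fat/muscle segmentation step (illustrative values).
img  = imread('stitched_steak.png');        % hypothetical stitched image
gray = rgb2gray(img);                       % work in grayscale: fat is bright
gray = imgaussfilt(gray, 2);                % smooth sensor noise before thresholding

% An adaptive threshold handles uneven lighting across the stitched image.
fatMask = imbinarize(gray, 'adaptive', 'Sensitivity', 0.55);
fatMask = bwareaopen(fatMask, 50);          % drop specks smaller than 50 px

marblingFraction = nnz(fatMask) / numel(fatMask);  % overall fat fraction
imshowpair(img, fatMask, 'montage');        % visual check: photo vs. marbling map
```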
By correlating this information with local texture data, we aimed to create a predictive framework that could estimate the toughness of a steak based on its marbling pattern. While our dataset wasn’t large enough to fully validate this model, we successfully built the framework for future research.
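One way such a predictive framework can be expressed in MATLAB, assuming per-region fat fractions and toughness scores have already been extracted (the variable names and placeholder data below are hypothetical):

```matlab
% Illustrative framework: relate local fat fraction to measured toughness.
% fatFrac and toughness stand in for per-region measurements (N x 1 vectors).
fatFrac   = rand(40, 1);                     % placeholder data
toughness = 2 - fatFrac + 0.1*randn(40, 1);  % placeholder data

mdl = fitlm(table(fatFrac, toughness), 'toughness ~ fatFrac');
disp(mdl.Rsquared.Ordinary);                 % variance explained by marbling alone
predicted = predict(mdl, table(fatFrac));    % estimated toughness from fat fraction
```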
To measure texture properties, we used a low-cost 3-axis load cell outfitted with interchangeable attachments. These attachments allowed us to measure elasticity (Young's modulus), cutability, grain direction, tenderness, and the recovery rate of muscle fibers.
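For the elasticity measurement, the core calculation is a linear fit on the initial stress-strain region. A minimal sketch, assuming a flat cylindrical attachment of known area and a recorded force-displacement trace (the geometry, cutoff, and placeholder trace below are assumptions):

```matlab
% Sketch of a Young's modulus estimate from a compression test.
% Assumed setup: flat cylindrical tip of known area pressed into the sample.
probeArea   = pi * 0.004^2;            % 4 mm tip radius, in m^2 (assumed)
sampleThick = 0.02;                    % sample thickness in m (assumed)

% force (N) and displacement (m) come from the recorded trace.
displacement = linspace(0, 0.002, 100)';                % placeholder trace
force        = 50e3 * probeArea * (displacement / sampleThick);

stress = force / probeArea;            % engineering stress, Pa
strain = displacement / sampleThick;   % engineering strain

% Fit the initial linear region; its slope approximates Young's modulus.
lin = strain < 0.05;                   % small-strain cutoff (assumed)
p   = polyfit(strain(lin), stress(lin), 1);
E   = p(1);                            % modulus estimate, Pa
```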
The motion path of each “poker” could be customized, but for these experiments it followed a predetermined path, with the load cell transmitting real-time data to MATLAB via a data acquisition device.
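A minimal sketch of that acquisition step, assuming an NI-style device read through MATLAB's Data Acquisition Toolbox; the vendor, device ID, channel names, and sample rate are placeholders:

```matlab
% Sketch of the real-time load-cell read, assuming an NI acquisition device;
% the device ID and channel names are placeholders.
d = daq('ni');
addinput(d, 'Dev1', 'ai0', 'Voltage');   % one load-cell axis per channel
addinput(d, 'Dev1', 'ai1', 'Voltage');
addinput(d, 'Dev1', 'ai2', 'Voltage');
d.Rate = 1000;                           % samples per second

data = read(d, seconds(2));              % timetable holding 2 s of readings
plot(data.Time, data.Variables);         % quick look at all three axes
```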
This data was then processed and overlaid onto visual maps, creating interactive, high-resolution texture maps of the meat samples.
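The overlay itself can be as simple as color-coded probe points drawn over the stitched image; a sketch with hypothetical variable names:

```matlab
% Sketch of overlaying sparse texture measurements on the stitched image.
% img is the stitched photo; xs and ys are probe positions in image
% coordinates; toughness holds the value measured at each point
% (all hypothetical names standing in for the real data).
imshow(img); hold on;
scatter(xs, ys, 120, toughness, 'filled');   % color-code each probe point
colormap(gca, 'jet'); colorbar;              % shared toughness color scale
hold off;
```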
This visualization represents just a fraction of the insights extracted from a single sample, and it demonstrates that we successfully completed Step 3, laying the foundation for future advancements in alternative meat texture replication.
Team: Alissa Sherbatov, Evan Tong, Samya Ahsan
Advisor: Hod Lipson