Farming Robot Kills 200k Weeds Per Hour With Lasers

[…] By combining artificial intelligence with precision laser technology, companies like Carbon Robotics are reshaping the way farmers tackle one of agriculture’s most labor-intensive tasks. These futuristic machines offer a glimpse into the potential of sustainable farming, where innovation meets efficiency, paving the way for a healthier and more productive future for agriculture.

25 Comments on Farming Robot Kills 200k Weeds Per Hour With Lasers

  1. “These futuristic machines offer a glimpse into the potential of sustainable farming, where innovation meets efficiency, paving the way for a healthier and more productive future for agriculture.”

    Reads like the press release for every cockamamie scheme that comes out of the urban planning division in every jurisdiction in Washington State that’s infested with University of Washington Urban Planning School graduates. It’s formulaic and relies on rearranging the same buzzwords.

  2. What are the criteria for getting the laser?
    No mention of that. How does it discriminate?
    It’s good in theory, but I’d like to know it doesn’t kill everything that isn’t “Roundup Ready.”

  3. Farming 101
    The real purpose behind GMO corn and soybeans, or Roundup Ready crops, is money.
    The seed, and every later generation grown from it, is patent protected. If you get caught replanting that seed the next year, you’ll be sued for patent infringement and barred from buying seed in the future.
    Before GMO seed, it was common to replant seed harvested the previous year, which cut expenses.
    My dad did it for decades. He would research new seed brands and mix in maybe 10% new seed each year. Maybe the yield was slightly lower, but it cost $4,000-5,000 less to plant.

  4. I’ve worked a lot with machine vision. Machines don’t see the world as we do, and lighting is everything. It’s tough to build a lighting scheme that works from full dark to bright sun to random flashes of lightning, and to feed that back to a database that can differentiate weeds at every stage of development from desirable crops at every stage of development. Whether you’re doing pattern recognition, color recognition, blob-location differentiation, or some combination, machine visual recognition depends heavily on consistency, and God’s Creations are all inconsistent individuals that come in different shades and shapes even within the same species. (A toy example of the kind of color/blob pass I mean is sketched below.)

    Also, best case, you’re burning the tops off the weeds. Anyone who’s ever had dandelions knows very well that you have to get the root, and this doesn’t do that.

    Good luck with that.

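A minimal sketch of the color-plus-blob pass SNS describes, assuming OpenCV in Python. The HSV hue band, the minimum blob area, and the input filename are invented for illustration and come from no real weeding system:

```python
import cv2

def find_green_blobs(bgr_frame, min_area=50):
    """Threshold 'plant-colored' pixels and return blob centroids."""
    hsv = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2HSV)
    # Fixed hue band for "green vegetation": exactly the kind of
    # hard-coded constant that breaks when lighting or plant color shifts.
    mask = cv2.inRange(hsv, (35, 40, 40), (85, 255, 255))
    n, _, stats, centroids = cv2.connectedComponentsWithStats(mask)
    # Label 0 is the background; drop tiny blobs (sensor noise, dust).
    return [tuple(centroids[i]) for i in range(1, n)
            if stats[i, cv2.CC_STAT_AREA] >= min_area]

# Hypothetical usage on one captured frame:
frame = cv2.imread("row_camera_frame.jpg")
if frame is not None:
    print(find_green_blobs(frame))
```

Even this toy pass makes his point: every constant in it encodes an assumption about lighting and color that a real field will violate.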
  5. @SNS — I wonder what possibilities exist for weed recognition in non-visible light wavelengths? More data to work with could mean better discrimination (see the NDVI sketch below).

    I want a smaller residential model optimized for Florida lawn sedge.

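For what it’s worth, the best-known concrete answer to the non-visible-wavelength question is NDVI from remote sensing: healthy vegetation reflects strongly in near-infrared, so the index (NIR - Red) / (NIR + Red) separates plants from soil far more robustly than visible color alone. A minimal sketch with made-up band values; note that it distinguishes plant from dirt, not weed from crop, so it narrows the search rather than finishing it:

```python
import numpy as np

def ndvi(nir, red, eps=1e-6):
    """Normalized Difference Vegetation Index, in [-1, 1]."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + eps)

# Toy 2x2 "image": left column vegetation-like, right column soil-like.
nir_band = np.array([[200, 60], [180, 55]], dtype=np.uint8)
red_band = np.array([[40, 55], [45, 50]], dtype=np.uint8)
print(ndvi(nir_band, red_band) > 0.3)  # rough vegetation threshold
# [[ True False]
#  [ True False]]
```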
  6. Uncle Al
    Sunday, 24 November 2024, 5:19 pm
    “@SNS — I wonder what possibilities exist for weed recognition in non-visible light wavelengths?”

    …I’ve worked in IR and UV spectrum applications, with polarizing and color filters too, and that sometimes solves some problems while causing others. UV has some obvious issues in bright daylight, and I’ve had even laser scanners outside visible wavelengths get dazzled by shafts of bright sunlight. And the more you obstruct your optics with filters, the more trouble you have with external interference such as dust or smoke.

    Are farms dusty?

    …also, this again causes consistency problems for pattern/color matching. A plant covered in dust, splattered with mud after a hard rain, or even hit with bird droppings may not match the database model on either of those terms. Outdoor farming conditions don’t lend themselves to consistency no matter what term you model on.

    Also, plants grow. That changes the distance to the optics, and so changes the image presentation, as does the way the plant morphs as it grows and flowers or goes to seed.

    That machine will have to do a ton of image processing somehow before you can even begin to think about targeting (lighting normalization, like the pass sketched below, would only be the first step).

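One standard mitigation for the lighting inconsistency described above is to normalize local contrast before any matching step. A sketch using OpenCV’s CLAHE on the luminance channel only; treating it as a pre-processing stage for a laser weeder is my assumption, not anything the article confirms:

```python
import cv2

def normalize_lighting(bgr_frame):
    """Equalize local contrast on luminance, leaving color untouched."""
    lab = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2Lab)
    l, a, b = cv2.split(lab)
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    return cv2.cvtColor(cv2.merge((clahe.apply(l), a, b)), cv2.COLOR_Lab2BGR)
```

This helps with shadow-to-sun transitions, but as the comment notes, it does nothing about dust on the optics or mud on the plant.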
  7. “Also, best case, you’re burning the tops off the weeds.”

    I did not watch the video; however, I have quoted machined parts for two different manufacturers of ag laser weed control. The laser shoots the roots, not the leaves. Two inches of topsoil is not going to stop a laser. Agriculture has become so automated we don’t need any southerners.

  8. Also, the weed-I.D. thing. I found this interesting two years ago when we were talking to these peeps. It was a lot easier to program the software as “if it doesn’t look like this, kill it.” In other words, if torchy is headed down a row of corn and runs across something that doesn’t look like corn, it roasts it. Same for tomatoes, potatoes, etc. Makes total sense. A lot less work. (Sketched below.)

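That inverted logic, where anything failing to match the crop gets lased, reduces the task to one-class recognition. A trivial sketch: `crop_match_score` is a hypothetical stand-in for whatever matcher the vendor actually runs, and the 0.8 cutoff is invented:

```python
def should_lase(plant_image, crop_match_score, cutoff=0.8):
    """Fire only when the matcher says this is NOT the crop.

    crop_match_score: callable returning 1.0 for a perfect crop match
    and 0.0 for no resemblance (hypothetical interface).
    """
    return crop_match_score(plant_image) < cutoff

# With dummy matchers, anything scoring under the cutoff gets zapped:
print(should_lase("weed pixels", lambda img: 0.2))   # True  -> roast it
print(should_lase("corn pixels", lambda img: 0.95))  # False -> spare it
```

The appeal is that you only ever have to model one plant; the risk is the mirror image, since any corn the matcher fails on gets roasted too.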
  9. …not to get too far into the weeds here (heh), and I’m not saying the challenges are insurmountable, but since I don’t work for a machine OEM or (directly) for a giant food company, my own personal access to machine vision systems is off-the-shelf Keyence, Banner, VISOR, systems like that, and they all use variants of the same three modes of recognition: blob, pattern, and color match.

    What you are describing sounds most like pattern match, which can trigger an output based on matched or not matched. The system could be given a model of a stalk of corn, a leaf of corn, something specific to “corn,” and then fire the kill output whenever it can’t match that model within a certain percentage of variance. You could also color match any part of the corn’s color and reject on that, again within a percentage. Where this seems problematic to me is that “corn” is not itself a static target but something that changes as it grows, grows differently, and shows different colors depending on how it’s fed and watered or not. And since this is presented as a 24/7/365 system, it’s presumably inspecting in all sorts of lighting, weather, and wind conditions, which change how long it has to process a moving image of a stalk or leaf. It will sometimes be doing that through obscurants such as dust, water droplets, or even heat shimmer, with sunlight from all angles and intensities, and at different heights depending on where the corn is in its growth cycle, since it starts out quite small and gets very tall.

    It is possible to keep multiple model images and take multiple captures for comparison, but that slows your processing considerably, as does raising the resolution of the image you’re inspecting. Because of the height difference, focus is going to be an issue as well; some cameras can and do auto-focus, but again, that takes time, and these machines don’t look like they’re allowing much time, given how fast they move in the video.

    I’m not saying it can’t be done by the people who build the machines, the cameras, and the software, not at all. I’m just saying that if they CAN inspect images of such variety with any accuracy at speed, then I want to know whether it’s available commercially for MY applications, because my off-the-rack stuff sometimes has issues inspecting a very consistently shaped bowl lid, at a fixed distance and focus, in a consistently illuminated space, if the lid is too clear and the product color within varies too much. I generally have milliseconds to conduct inspections of large numbers over pretty big areas, so I’d really like more information on how they overcame these challenges. (A toy version of the match-within-variance idea follows below.)
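A toy version of the match-within-a-percentage-of-variance idea, using OpenCV’s normalized cross-correlation. A production system would almost certainly use learned models rather than one template, and the 0.7 score floor here is an arbitrary stand-in for that variance percentage:

```python
import cv2

def looks_like_crop(scene_gray, template_gray, min_score=0.7):
    """True if the crop template matches anywhere in the frame.

    template_gray must be smaller than scene_gray.
    """
    result = cv2.matchTemplate(scene_gray, template_gray,
                               cv2.TM_CCOEFF_NORMED)
    _, best, _, _ = cv2.minMaxLoc(result)
    return best >= min_score  # below the floor -> classify as weed

# Hypothetical grayscale captures:
scene = cv2.imread("row_frame.jpg", cv2.IMREAD_GRAYSCALE)
template = cv2.imread("corn_leaf_model.jpg", cv2.IMREAD_GRAYSCALE)
if scene is not None and template is not None:
    print(looks_like_crop(scene, template))
```

Normalized correlation at least tolerates uniform brightness changes, but it is scale- and rotation-sensitive, which is exactly the growth-stage problem raised above.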

Comments are closed.