[…] By combining artificial intelligence with precision laser technology, companies like Carbon Robotics are reshaping the way farmers tackle one of agriculture’s most labor-intensive tasks. These futuristic machines offer a glimpse into the potential of sustainable farming, where innovation meets efficiency, paving the way for a healthier and more productive future for agriculture.
You already know what Black Rock and the defense industry are thinking.
But where will the invaders work??? Washington gardens???
“These futuristic machines offer a glimpse into the potential of sustainable farming, where innovation meets efficiency, paving the way for a healthier and more productive future for agriculture.”
Reads like the press for every cockamamie scheme that comes out of the Urban planning division in every jurisdiction in Washington State that is infested with University of Washington Urban Planning School graduates. It’s formulaic and relies on a rearrangement of the same buzz words.
https://youtu.be/tJF9k1R0bPc?si=o2rWu2xqomU04H4W
Can it be set for leftists?
@JDH “It’s formulaic and relies on a rearrangement of the same buzz words.”
On the leading edge of the foreskin of technology 🙂
Still not fast enough to keep the dandelions out of my yard.
Imagine what it could do with illegal aliens.
That was as interesting to watch as it was to visualize how it works. I hope it works on pest insects and not desirable or neutral ones.
What are the criteria for getting the laser?
No mention of that. How does it discriminate?
It is good in theory, but I’d like to know it doesn’t kill everything that isn’t ’roundup ready’
An unsupervised, experimental, large machine with lasers deployed 24/7 in every kind of weather around dinky, unlit rural roads.
What could possibly go wrong?
Farming 101
The real purpose behind GMO corn and soybeans, or Roundup Ready crops, is money.
The seed and all subsequent generations of those seeds are patent protected. If you get caught replanting that seed the next year, you can be sued for patent infringement and barred from buying seed in the future.
It was common prior to GMO seed to reuse the seed harvested the previous year to plant the following year, reducing expenses.
My dad did it for decades. He would research new seed brands and mix in maybe 10% new seed each year. Maybe the yield was slightly lower, but it cost $4,000-5,000 less to plant.
I’ve worked a lot with machine vision. Machines don’t see the world as we do, and lighting is everything. It’s tough to have a lighting scheme that works from full dark to bright sun to random flashes of lightning, and feed this information back to a database that could differentiate weeds in every stage of development from desirable crops in every stage of development. Whether doing pattern recognition, color recognition, blob location differentiation, or some combination, machine visual recognition is largely dependent on consistency, and God’s Creations are all inconsistent individuals that come in different shades and shapes even within the same species.
Also, best case, you’re burning the tops off the weeds. Anyone who’s ever had dandelions knows very well that you have to get the root, and this doesn’t do that.
Good luck with that.
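SNS’s lighting point can be sketched in a few lines. This is a hypothetical toy classifier, not anything Carbon Robotics actually uses: a fixed "green enough" color rule that works in full sun silently fails on the very same leaf when the light level drops, which is the consistency problem described above.

```python
# Toy example (assumed thresholds): a fixed color rule breaks when lighting
# changes, even though the object in front of the camera is identical.

def is_plant_pixel(rgb):
    """Naive rule: green channel is bright and dominates red and blue."""
    r, g, b = rgb
    return g > 100 and g > 1.3 * r and g > 1.3 * b

leaf_noon = (60, 160, 50)                            # leaf in bright midday sun
leaf_dusk = tuple(int(0.4 * c) for c in leaf_noon)   # same leaf at 40% light: (24, 64, 20)

print(is_plant_pixel(leaf_noon))  # True: passes the fixed thresholds
print(is_plant_pixel(leaf_dusk))  # False: g = 64 fails the g > 100 gate
```

Normalizing by overall brightness helps with this particular failure, but as the comment notes, every added normalization or filter trades one inconsistency for another.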
@SNS — I wonder what possibilities exist for weed recognition in non-visible light wavelengths? More data to work with could mean better discrimination.
I want a smaller residential model optimized for Florida lawn sedge.
Uncle Al
Sunday, 24 November 2024, 5:19 pm
“@SNS — I wonder what possibilities exist for weed recognition in non-visible light wavelengths? ”
…I’ve worked in IR and UV spectrum applications with polarizing and color filters too, and this sometimes solves some problems but causes others. UV has some obvious issues in bright daylight, and I’ve had even laser scanners outside visible wavelengths get dazzled by bright sunlight shafts, and the more you obstruct your optics with various filters, the more issues you have with external interference such as dust or smoke.
Are farms dusty?
…also, this again causes consistency problems for pattern/color matching. A plant covered in dust or splattered in mud after a hard rainfall, or even in bird droppings, may not match the database model on either of those terms. Outdoor farming conditions don’t lend themselves to consistency no matter what term you are modeling on.
Also, plants grow. This can change the distance to the optics and so change the image presentation, as will the way the plant morphs as it grows and flowers or seeds.
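The scale problem in the paragraph above (plants growing toward a fixed camera) falls out of the standard pinhole camera model. A minimal sketch, with an assumed focal length in pixels: the same leaf appears noticeably larger in the image simply because it has grown closer to the optics, so raw pixel-size features drift over the season.

```python
# Sketch under a pinhole camera model (focal length here is an assumption):
# apparent size in pixels = focal_px * real_width / distance.

def apparent_width_px(real_width_m, distance_m, focal_px=800.0):
    """Projected width in pixels of an object of a given real width."""
    return focal_px * real_width_m / distance_m

seedling = apparent_width_px(0.02, 1.0)  # 2 cm leaf, 1.0 m from the camera
grown    = apparent_width_px(0.02, 0.7)  # same leaf after growing 30 cm closer

print(seedling)  # 16.0 pixels
print(grown)     # ~22.9 pixels: same leaf, ~43% "bigger" to the matcher
```

Any template or size-based matching has to normalize for this, which means estimating distance per plant, one more moving part before targeting even starts.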
That machine will have to do a ton of image processing somehow before you can even begin to think about targeting.