Yes, schools should definitely teach children how to grow plants. Caring for a living thing is a great way for them to learn responsibility, and it connects them to nature and teaches them about the environment in a hands-on way. It's also a practical application of biology, which can make the subject feel more interesting and relevant to students.