Nowadays, robots are expected to enter various application scenarios and interact with unknown and dynamically changing environments. This highlights the need for autonomous robot behaviours that explore such environments, identify their characteristics, adapt to them, and build knowledge for future interactions. To address this need, in this paper we present a novel framework that integrates multiple components to achieve context-aware and adaptive interaction between the robot and uncertain environments. The core of this framework is a novel self-tuning impedance controller that regulates the robot's quasi-static parameters, i.e., stiffness and damping, based on robot sensory data and vision. The parameters are tuned only in the direction(s) of interaction or movement, by distinguishing expected interactions from external disturbances. A vision module is developed to recognize environmental characteristics and to associate them with previously or newly identified interaction parameters, while the robot remains able to adapt to new changes or unexpected situations. This enables faster robot adaptation, starting from better initial interaction parameters. The framework is evaluated experimentally in an agricultural task, where the robot effectively interacts with various deformable environments.
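For context, a minimal sketch of a standard Cartesian impedance law that a self-tuning controller of this kind could modulate is given below; the symbols $\mathbf{K}(t)$, $\mathbf{D}(t)$, $\mathbf{x}_d$, and $\mathbf{F}$ are illustrative assumptions and not necessarily the paper's notation:
\[
\mathbf{F} = \mathbf{K}(t)\,\bigl(\mathbf{x}_d - \mathbf{x}\bigr) + \mathbf{D}(t)\,\bigl(\dot{\mathbf{x}}_d - \dot{\mathbf{x}}\bigr),
\]
where the stiffness $\mathbf{K}(t)$ and damping $\mathbf{D}(t)$ matrices would be adjusted online from sensory and visual feedback, and only along the direction(s) of interaction or movement.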