Bipedal robots have acquired significant locomotion capabilities over the past few years, especially at the control level. Navigation over complex and unstructured environments using exteroceptive perception, however, remains an active research topic. In this paper, we present a footstep planning system that produces foothold placements using visual perception and appropriate environment modeling, given a black-box walking controller. In particular, we extend a state-of-the-art search-based planning approach (ARA*), which produces 6DoF footstep sequences in 3D space for flat uneven terrain, to also handle rough curved surfaces, e.g., rocks. This is achieved by integrating into the existing planning framework both a curved-patch modeling system for rough local terrain surfaces and a flat-foothold contact analysis based on visual range data. The system is experimentally validated on real-world point clouds, and rough-terrain stepping demonstrations are presented in simulation on the WALK-MAN humanoid robot.
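For illustration, the sketch below shows a simplified ARA*-style (anytime weighted A*) search loop of the kind the planning component builds on: repeated weighted A* passes with a decreasing heuristic inflation, reusing cost-to-come values between passes. The state representation, successor function, heuristic, and epsilon schedule here are all illustrative assumptions over a toy 2D grid; the actual planner searches 6DoF footstep transitions over curved patches fitted to range data.

```python
# Minimal sketch of an ARA*-style (anytime weighted A*) search, assuming a toy
# 2D grid in place of the real 6DoF footstep graph. All names and parameters
# are hypothetical, not the paper's implementation.
import heapq

def ara_star(start, goal, successors, heuristic, eps_schedule=(3.0, 2.0, 1.0)):
    """Replan with decreasing heuristic inflation, reusing cost-to-come values."""
    g = {start: 0.0}
    parent = {start: None}
    best_path = None
    for eps in eps_schedule:                      # shrink inflation each round
        open_heap = [(g[s] + eps * heuristic(s, goal), s) for s in g]
        heapq.heapify(open_heap)
        closed = set()
        while open_heap:
            _, s = heapq.heappop(open_heap)
            if s in closed:
                continue
            closed.add(s)
            if s == goal:                         # goal reached for this epsilon
                path = []
                while s is not None:
                    path.append(s)
                    s = parent[s]
                best_path = list(reversed(path))
                break
            for nxt, step_cost in successors(s):  # feasible footstep transitions
                new_g = g[s] + step_cost
                if new_g < g.get(nxt, float("inf")):
                    g[nxt] = new_g
                    parent[nxt] = s
                    heapq.heappush(open_heap, (new_g + eps * heuristic(nxt, goal), nxt))
    return best_path

# Toy usage: "footsteps" as 2D cells on a 10x10 grid with unit step cost.
def successors(s):
    x, y = s
    return [((x + dx, y + dy), 1.0)
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
            if 0 <= x + dx < 10 and 0 <= y + dy < 10]

def heuristic(s, goal):
    return abs(s[0] - goal[0]) + abs(s[1] - goal[1])

if __name__ == "__main__":
    print(ara_star((0, 0), (9, 9), successors, heuristic))
```

In the full system, each successor would additionally be checked against the curved-patch terrain model and the flat-foothold contact analysis before being admitted to the search.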