Human-robot collaboration requires perception models that can be updated quickly and that enable safe robot motion in dynamic scenes. To this end, we have designed an Interactive Distance Field Mapping and Planning (IDMP) framework that handles dynamic objects and collision avoidance in a single efficient representation. We define "interactive mapping and planning" as the process of creating and updating a representation of the scene in real time while simultaneously planning and adjusting the robot's actions based on that representation. From depth sensor data, our system builds a continuous field from which the distance and gradient (direction) to the nearest obstacle can be queried at any position in 3D space. The key contribution of this work is an efficient Gaussian process field that supports incremental updates and implicitly handles dynamic objects through a simple and elegant formulation based on a temporary latent model. IDMP fuses point cloud data from single or multiple sensors, queries free space at arbitrary spatial resolution, and handles moving objects without requiring semantic information. For planning, IDMP integrates seamlessly with gradient-based motion planners, enabling fast replanning for collision-free navigation. Finally, IDMP also makes it possible to determine grasp poses for unknown objects based on their surface curvature.
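To illustrate the kind of query such a field supports, the sketch below fits a toy Gaussian process to surface points and recovers the Euclidean distance and direction to the nearest obstacle at an arbitrary query position. This is a minimal, self-contained illustration only: the squared-exponential kernel, the log-based distance recovery, the finite-difference gradient, and all hyperparameters are assumptions for clarity, not the paper's exact formulation (which additionally handles incremental updates and dynamic objects via a temporary latent model).

```python
import numpy as np

def rbf_kernel(a, b, lengthscale):
    # Squared-exponential kernel between two point sets of shape (n, 3) and (m, 3).
    d2 = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

class GPDistanceField:
    """Toy Gaussian process distance field (illustrative, not IDMP's exact model).

    Surface samples are regressed against a constant occupancy value of 1.
    The Euclidean distance is then recovered from the latent field with a
    log transform, and the direction to the nearest obstacle is taken from
    finite-difference gradients of the distance.
    """

    def __init__(self, surface_points, lengthscale=0.2, noise=1e-4):
        self.x = np.asarray(surface_points, dtype=float)
        self.ls = lengthscale
        k = rbf_kernel(self.x, self.x, lengthscale)
        k[np.diag_indices_from(k)] += noise  # jitter for numerical stability
        # Precompute regression weights: K alpha = 1 (occupancy target).
        self.alpha = np.linalg.solve(k, np.ones(len(self.x)))

    def distance(self, q):
        # Latent occupancy at the query, clipped into (0, 1] for the log.
        mean = rbf_kernel(np.atleast_2d(q), self.x, self.ls) @ self.alpha
        mean = np.clip(mean, 1e-12, 1.0)
        # Invert exp(-d^2 / (2 ls^2)) = mean  =>  d = ls * sqrt(-2 ln mean).
        return self.ls * np.sqrt(-2.0 * np.log(mean))

    def gradient(self, q, eps=1e-4):
        # Central differences give the (normalized) direction of increasing
        # distance, i.e. pointing away from the nearest obstacle.
        g = np.zeros(3)
        for i in range(3):
            dq = np.zeros(3)
            dq[i] = eps
            g[i] = (self.distance(q + dq)[0] - self.distance(q - dq)[0]) / (2 * eps)
        return g / (np.linalg.norm(g) + 1e-12)
```

A gradient-based motion planner would query `distance` and `gradient` at candidate waypoints and push them along the gradient until a desired clearance is reached; in IDMP, the same two queries are what the efficient GP field exposes.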