Generating sequences of human-like motions for humanoid robots presents challenges in collecting and analyzing reference human motions, synthesizing new motions based on these references, and mapping the generated motions onto humanoid robots. To address these issues, we introduce SynSculptor, a humanoid motion analysis and editing framework that leverages postural synergies for training-free human-like motion scripting. To analyze human motion, we collect 3+ hours of motion capture data from 20 individuals, during which a real-time operational space controller mimics the human motion on a simulated humanoid robot.
The major postural synergies are extracted by applying principal component analysis (PCA) to velocity trajectories segmented by changes in robot momentum, yielding a style-conditioned synergy library for free-space motion generation.
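A minimal sketch of this extraction step, assuming retargeted joint-velocity and momentum trajectories are available as arrays (the segmentation threshold, segment filtering, and all function names are illustrative assumptions, not the paper's API):

```python
import numpy as np
from sklearn.decomposition import PCA

def segment_by_momentum(qdot, momentum, eps=1e-2):
    """Split a joint-velocity trajectory (T x n_joints) at frames where the
    total-momentum magnitude crosses a small threshold, an illustrative
    stand-in for 'changes in robot momentum'."""
    low = (np.linalg.norm(momentum, axis=1) < eps).astype(int)
    boundaries = np.where(np.diff(low) != 0)[0] + 1
    # Drop segments too short to resample meaningfully.
    return [s for s in np.split(qdot, boundaries) if len(s) >= 3]

def extract_synergies(segments, n_synergies=5, n_samples=50):
    """Time-normalize each segment, flatten it, and fit PCA; the leading
    principal components serve as the postural synergies."""
    rows = []
    for seg in segments:
        t_old = np.linspace(0.0, 1.0, len(seg))
        t_new = np.linspace(0.0, 1.0, n_samples)
        resampled = np.stack(
            [np.interp(t_new, t_old, seg[:, j]) for j in range(seg.shape[1])],
            axis=1,
        )
        rows.append(resampled.ravel())
    pca = PCA(n_components=n_synergies)
    pca.fit(np.asarray(rows))
    return pca  # pca.components_ holds the synergy basis
```

Under this reading, a style-conditioned library could amount to fitting one such basis per style label.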
To evaluate motions generated using the synergy library, the foot-sliding ratio and proposed smoothness metrics based on total-momentum and kinetic-energy deviations are computed for each generated motion and compared with those of the reference motions.
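A sketch of how such per-clip metrics might be computed; the contact signal, sliding threshold, and the exact deviation statistic are assumptions, as the precise definitions are the paper's own:

```python
import numpy as np

def foot_sliding_ratio(foot_pos, foot_contact, slide_thresh=2e-3):
    """Fraction of contact frames in which a foot flagged as in contact
    still moves horizontally by more than slide_thresh (meters/frame).
    foot_pos: (T, 3) positions; foot_contact: (T,) boolean contact flags."""
    disp = np.linalg.norm(np.diff(foot_pos[:, :2], axis=0), axis=1)
    contact = foot_contact[1:].astype(bool)
    if contact.sum() == 0:
        return 0.0
    return float((disp[contact] > slide_thresh).mean())

def smoothness_deviation(signal):
    """RMS frame-to-frame deviation of a scalar signal such as the total
    momentum magnitude or kinetic energy; lower means smoother motion."""
    return float(np.sqrt(np.mean(np.diff(signal) ** 2)))
```

Each generated clip's statistics can then be compared against those of the reference motions.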
Finally, we leverage the synergies with a motion-language transformer: while executing motion tasks with its end-effectors, the humanoid adapts its posture based on the chosen synergy.
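A minimal sketch of synergy-driven posture adaptation during an end-effector task, assuming a standard nullspace projection on top of the operational space controller (the gain and symbols are illustrative, not necessarily the paper's mechanism):

```python
import numpy as np

def posture_adapted_qdot(J, xdot_task, synergy_dir, alpha=0.5):
    """Track an end-effector velocity xdot_task via the Jacobian
    pseudoinverse, while biasing the posture along a chosen synergy
    direction in the task nullspace:
        qdot = J^+ xdot + (I - J^+ J) * alpha * synergy_dir
    J: (m, n) task Jacobian; synergy_dir: (n,) synergy direction."""
    J_pinv = np.linalg.pinv(J)
    nullspace = np.eye(J.shape[1]) - J_pinv @ J
    return J_pinv @ xdot_task + nullspace @ (alpha * synergy_dir)
```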