Following my previous post about how Groundspeed compares to Firebug and the Web Developer extension, another interesting question is how Groundspeed compares to client-side proxies (Paros, Burp, etc.) and other tools that modify HTTP requests, like TamperData.
The data we manipulate when testing input validation usually originates at the application's user interface (the forms). At the user interface, input data makes sense because the interface was designed to be used by people: the page labels and surrounding text give the information its context.
When we work at the HTTP level (using client-side proxies or TamperData) we lose the context provided by the interface. Even if HTTP parameters sometimes act as pseudo-labels, they were not meant to be read by people and could be any random string. At the HTTP level we have to map each parameter to what it actually means, and for data coming from the interface this means mapping parameters back to the form elements so we can use the labels as context.
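To make the problem concrete, here is a minimal sketch (not part of Groundspeed) of what that mapping work looks like. The form markup and field names (`fld_07`, `fld_12`) are hypothetical: the parameter names that show up at the HTTP level mean nothing on their own, and only by walking back to the `<label>` elements do we recover the context a tester sees in the interface.

```python
from html.parser import HTMLParser

# Hypothetical form: the parameter names carry no meaning by themselves;
# only the <label> text tells us what each field actually is.
FORM = """
<form action="/submit" method="post">
  <label for="fld_07">Social Security Number</label>
  <input type="text" id="fld_07" name="fld_07">
  <label for="fld_12">Annual Income</label>
  <input type="text" id="fld_12" name="fld_12">
</form>
"""

class LabelMapper(HTMLParser):
    """Builds a parameter-name -> label-text mapping from form markup."""

    def __init__(self):
        super().__init__()
        self.labels = {}        # id referenced by <label for="..."> -> label text
        self.params = {}        # input name -> label text
        self._current_for = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "label":
            self._current_for = attrs.get("for")
        elif tag == "input" and "name" in attrs:
            # Recover the human-readable context for this HTTP parameter.
            self.params[attrs["name"]] = self.labels.get(attrs.get("id"), "(no label)")

    def handle_data(self, data):
        if self._current_for and data.strip():
            self.labels[self._current_for] = data.strip()
            self._current_for = None

mapper = LabelMapper()
mapper.feed(FORM)
for name, label in mapper.params.items():
    print(f"{name} -> {label}")
# fld_07 -> Social Security Number
# fld_12 -> Annual Income
```

This is exactly the translation step a proxy user performs mentally for every request; working directly on the form, as Groundspeed does, skips it entirely.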
This problem may be easy to solve for simple web applications, where one HTTP request matches one form and form elements map to parameters one to one, but it is much harder for complex applications with more client-side logic and no one-to-one correspondence between requests and forms.