A few days ago, I found myself in a good discussion with my PR colleague Allan Schoenberg about the legitimacy of public relations as an industry. Allan was lamenting that yet another network TV series this winter may portray the PR profession in a somewhat negative light. I immediately responded by saying this has been going on for years (Sex and the City, Spin City, PoweR Girls, for example) and that it doesn't help our cause as we fight the good fight for a seat at the proverbial "table."
Allan's response surprised me: "I actually think shows like Sex and the City legitimize the PR industry by highlighting it in a very public way" (I'm summarizing--Allan, correct me if I'm wrong).
Hmm...what do you think? See, I tend to think that every time one of these shows characterizes a PR professional as a flack or glorified party planner, we lose a little bit of credibility with our clients. They might not come out and say it--heck, they might not even consciously be thinking it. But I believe shows like these affect people's perceptions of, and attitudes toward, our industry at the very least.
Now, I know that at the core, we need to earn our clients' respect through the work we produce and the counsel we offer day in and day out. That's a given. But I can't help but wonder if these shows still have a negative impact on an industry that's worked so hard to legitimize itself over the years.
What do you think? Are we still seen as party planners and spin doctors by our clients? Or are we making progress? Do the clients we work with see us as true advisors and business partners--the same way they see their attorneys and financial counselors?
Where do you stand?