I have been working at a doctor's office for a little over 3 years now. I love the job! My doctor and I work well together, I get along with my coworkers, and I love the hours. The only downside: a couple of years before I started, the company stopped giving raises to any of its employees. I didn't know this when I got hired. There's a lot of politics involved that I don't want to get into. The office recently merged with one of the major hospitals in the city, and we are hoping this will open up the possibility of pay raises, but it's not certain. I've got bills to pay and kids to support, expensive ones.
I'm not sure if I should find another job that offers pay raises or stick with where I am. I know I'm the only one who can answer this, but I would appreciate any input from you. Thanks!