Portland Copwatch Analyzes "Independent" Police Review Division 2009 Annual Report

Table of contents
Neutral Tone Shows IPR Capable of Objectivity
Slimmer Report, Less Information
Details In, Details Out
Discipline and Repeat Offender Officers
Community Feedback
Correcting the Record
Questions Raised

Portland Copwatch
a project of Peace and Justice Works
PO Box 42456
Portland, OR 97242
(503) 236-3065/ Incident Report Line (503) 321-5120
e-mail: copwatch@portlandcopwatch.org

Despite Frequently Taking a More Neutral Tone
an analysis by Dan Handelman, Portland Copwatch
June 25, 2010

The Independent Police Review (IPR) Division's 2009 Annual Report is a mixed bag of useful and buried information, neutral reporting and public relations. On the whole it is geared less toward touting the IPR's statistics as achievements, a point we raised in our analyses of their 2007 and 2008 reports. There is less of an implication that the drop in complaints, use of force complaints, and officer-involved shootings from 2007 to 2009 was the result of IPR's work. However, the IPR still chose to highlight these trends in their Executive Summary pamphlet, while ignoring, for instance, that only one of 27 cases investigated by the Internal Affairs Division (IAD) was completed by the Bureau within the 5-month guideline. The report also continues to play up certain statistics and trends--such as the "sustain rate"--while relegating to the report's back pages the information that the satisfaction rate with IPR has gone down and dissatisfaction has remained at 50%. Below is analysis of the 2009 report by Portland Copwatch (PCW).

Neutral Tone Shows IPR Capable of Objectivity

The "trends and highlights" in the "Report Overview" section (p. 1) are generally presented with neutral language and no implied conclusions. For example, the first point reveals that the number of complaints "continued a downward trend ...from 771 in 2005 to 405 in 2009." The neutral tone is welcome, since there is no way to know, for instance, whether the scathing report issued by consultant Eileen Luna Firebaugh in January, 2008 led to broader community mistrust of IPR, whether that distrust has been multiplied by the lengthy investigation into the police beating death of James Chasse, Jr., or if, as some suggest, the police are simply not committing as many acts of misconduct.

They also state, with no fanfare, that there was only one officer-involved shooting and no deaths in custody in 2009, while "there were approximately eight shootings and/or deaths per year from 1997 through 2006." Previous implications that deadly force incidents were dropping thanks to the IPR's work led us to caution that when shootings went back up (as they have in 2010), the IPR/Auditor would take the blame.

Later in the report, describing the stages of a complaint, IPR refers to IAD's role as conducting an "administrative investigation," which is a much better term than "disciplinary investigation." The previous term led both officers and civilians to believe that the outcome would always result in discipline.

There is a downside to the neutrality, which is that by selectively choosing facts, the system appears to be functioning better than it really is.

--Sustain Rate and "Service Improvement Opportunities"

The issue of the "sustain rate," which Luna Firebaugh addressed at length, is a prime example: The report says that "22% of cases fully investigated by the Police Bureau... resulted in one or more sustained finding [that the officer was out of policy]." Looking more closely at that sentence, you see that the 22% applies only to the small pool of cases that were fully investigated. Just above that fact, IPR reports that they only turned over 37% of incoming complaints to IAD, or 140 out of 375 complaints processed.

IAD then rejected 25% of those cases, handled others as minor complaints, and investigated 17% (p. 14).

Here are some alternate, more realistic "sustain rates" depending on how one wishes to count the pool of possible results:
--8.1%, if the 13 cases with one or more sustained findings are compared to the 160 cases handled by IAD (pages 13 & 16);
--3.5%, if the 13 cases are looked at in the pool of 375 complaints processed by IPR in 2009 (p. 6);
--3.2%, if gauged the way Luna Firebaugh suggests, against the pool of all 405 complaints made in 2009 (p. 5);
--2.8%, if gauged against all 464 cases closed by IPR in 2009 (p. 5).

We'd generously go with the 3.5% figure, and note that because of the overall decline in incoming complaints, this year's unrealistically high 22% number from IPR is the least misleading since 2002: while in other years their "sustain rate" has been 12-16 times too high, this year it is only about 7 times too high.
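The alternate rates above are straightforward divisions: the same 13 sustained cases measured against different candidate pools. As an illustrative sketch (using only the case counts quoted from the report pages cited above), the arithmetic works out as follows:

```python
# Alternate "sustain rates" for 2009: 13 cases with at least one
# sustained finding, divided by four candidate pools. All counts
# come from the IPR report pages cited in the text above.
sustained_cases = 13

pools = {
    "cases handled by IAD": 160,
    "complaints processed by IPR": 375,
    "all complaints made in 2009": 405,
    "all cases closed by IPR in 2009": 464,
}

for label, pool in pools.items():
    rate = 100 * sustained_cases / pool
    print(f"{rate:.1f}% of {label}")
```

The smaller the denominator IPR chooses, the better its system looks; the 22% headline figure uses the smallest pool of all (27 fully investigated cases).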

The section on "Bureau Initiated Complaints" includes the figure that 23 officers had discipline imposed on them--but we would hope that was the combined result of Bureau and community complaints. Otherwise the implication is that discipline is only imposed when one officer turns in another for suspected misconduct.

Again, the percentages on Bureau complaints are mildly misleading, though sustained allegations are indeed quite high at 58% (p. 17). The report claims 62% of cases were resolved with at least one sustained finding, but that doesn't reflect all cases filed--while 34 were closed, a total of 48 were opened. Thus, the sustain rate on Bureau-Initiated Complaints is really 44%, which is still relatively high. This again leads to the question of whether officers are believed more than civilians. In this instance, officers filing the complaints seem to have more credibility than officers under scrutiny, who in turn appear to have more credibility than civilian complainants.
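The 44% figure above can be reproduced with a short sketch, assuming (as the report implies) that the 62% applies to the 34 closed cases:

```python
# Recomputing the Bureau-initiated "sustain rate" against all 48
# cases opened, rather than only the 34 closed (report figures).
closed, opened = 34, 48
share_sustained = 0.62                       # 62% of closed cases

sustained = round(share_sustained * closed)  # about 21 cases
rate = 100 * sustained / opened

print(f"{sustained} sustained cases, {rate:.0f}% of all cases opened")
```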

"Service Improvement Opportunities" (SIOs) have climbed from being used by IAD 34-54% of the time in 2002-2006, to 51-60% 2007-2009 (p. 14). These minor complaints (which would be a better name for "Service Complaints" than "SIOs") are for violations of policy that normally do not rise to the level of discipline. The 2008 report (p. 19) explicitly states that SIOs are not considered discipline, but that is not clear in the new report.

Disparate treatment, law enforcement treating someone differently or otherwise using race inappropriately in a police action, is one of the most serious offenses an officer can commit, yet only one racial profiling/disparate treatment case has been sustained since 2002 (in 2007). While the community might expect this behavior to result in discipline, ten racial profiling cases were handled as minor complaints/SIOs in 2009 (p. 14), and seven in 2008 (2008 p. 19). It is understandably difficult to prove discrimination without evidence such as the use of a racial epithet, so perhaps many of these cases, if investigated, would be given an "Unproven" finding. However, if the complaints are being resolved with a supervisor talking to the officer, it implies there is substance to these allegations. A full finding of "Unproven with a debriefing" is more serious on an officer's record than a "Service Improvement Opportunity." Compounding the seriousness of profiling, the use of minor complaints/SIOs to resolve such allegations rose from the #8 most frequent use to #3 this past year.

Slimmer Report, Less Information

The 2009 report has been slimmed down and made somewhat easier to read than past efforts. The downside is that some of the information that was previously discussed or presented in the body of the report is now buried in the appendix or missing from the publication.

One table that is sorely missed showed the combined number of cases dismissed by IPR, assigned for investigation, and declined by IAD (2008, p. 14). Using the 2009 numbers, PCW was able to determine that only 27 of the 375 cases processed, or 7.2%, were investigated. This is down from 9.6%, 9.7% and 8.9% in 2006, 2007 and 2008, respectively. In other words, the odds of a citizen's complaint getting an investigation went from about one in 10 to about one in 14.
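The "one in 10" versus "one in 14" odds come from inverting the investigation percentage; a quick sketch of that arithmetic, using the case counts cited above:

```python
# Chance of a complaint receiving a full investigation, and the
# equivalent "one in N" odds, from the 2009 case counts above.
investigated, processed = 27, 375

rate = 100 * investigated / processed   # percent investigated
odds = processed / investigated         # one in N complaints

print(f"{rate:.1f}% investigated, about one in {round(odds)}")
```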

--Lack of timeliness glossed over

While we welcome the deletion of the multiple graphics showing timeliness at different stages of the investigation process, there is no substantive discussion of the time it takes to process complaints--an area of low complainant satisfaction (pp. 31 & 33) and one of the issues that led to the creation of IPR in 2001. In fact, both the Majority and Minority reports of the 2000 Mayor's Work Group on PIIAC listed timeliness as an important factor.

As mentioned above, only one of 27 investigations was completed by the Bureau within a five month timeline, with none completed in four months (p. 38). It's not clear exactly how long most investigations take at Internal Affairs; the chart says that 44%, or 12 of 27, were finished in 10 weeks. It does not say how long the other 56% took. The main holdup, though, seems to be waiting for Commanders to attach findings to the cases, which happened within three months only 18% of the time, presumably in 5 of 27 complaints. Considering that a shorter 45-day timeline was met 50% of the time in 2008, this seems to merit serious discussion. However, the only two comments on this trend, both buried in the appendix, are:
(1) "Fully investigated cases frequently exceed the timelines" and
(2) "Other measures for IAD and Police Bureau management suggest that timeliness was an issue in 2009" (both on p. 38).

The fact that IPR generally completes 90% of its cases within 150 days (p. 6) is irrelevant if roughly 10% of cases are investigated and those are the ones falling outside the benchmarks.

--Other missing useful information

Other charts, graphics and text have been removed that would make the report more user-friendly to those not familiar with IPR. For instance, there used to be a flowchart showing the basic steps of the complaint process (2008, p. 2) and a summary of the history of IPR (2008, p. 1) which if nothing else could be put in the appendix.

Both the statistical information and a chart comparing the number of complaints to the overall number of police contacts are now missing (2008, p. 8). Though we're not convinced this is the most meaningful statistic, it is helpful for comparing Portland to other cities and tracking Portland's complaint rate from year to year.

Other tables that helped clarify the process were one showing the complaint categories (force, conduct, control technique, courtesy, procedure) and another showing the possible decisions IPR can make (dismissal, mediation, investigation, referral to other agency), which are detailed in Chapter 2. Both appeared on page 4 of the 2008 report.

On page 7, the description of the mediation process does not reveal how many cases last year were successfully discussed by complainants and officers with a professional mediator. These statistics were in the 2008 report on p. 16, and covered the various outcomes, including how many mediations were pending.

--Misleading and missing information about the outcome ("findings") assigned to misconduct allegations

One of the most confusing aspects of the report, as illustrated above in the discussion of the "sustain rate," is just how many complaints are processed every year. The IPR has explained to us that because of lag time between the incoming calls and the case handling at IPR and/or IAD, the numbers in the report do not always add up. One example is that IPR turned over 140 cases to IAD in 2009 (p. 6), but IAD processed 160 cases (p. 14). IPR could lessen the confusion by explaining this in the report, and by continuing to publish the "complaints closed" chart (2008, p. 8), which did not appear in the new report.

Another measure of a civilian review board's effectiveness is how many times officers are "exonerated," or found within Bureau policy, as opposed to an "insufficient evidence"-type finding where there was not enough evidence to prove or disprove the complaint. This is important because many cases are "he said-she said," yet when there are more "exonerated" findings, it implies that the police testimony is given more weight than the civilian's. This has been a problem in Portland, as seen in the 2008 report (p. 20), in the new report, which shows 42% of findings as exonerated (p. 16), and in previous reports. The exonerated rate was 27-38% in 2002-2004, but 35-43% in 2005-2009.

The "insufficient evidence" rate is hard to gauge, since that finding was combined with "unproven" in 2007. Prior to that, "insufficient evidence" only made up 12-25% of all findings. Unfounded, meaning the incident did not happen the way the complainant alleges, was used 25-41% of the time. The new "Unproven" finding, which combines these last two findings, was used 47% of the time in 2008 and 51% of the time in 2009. IPR re-inserting the table showing how these findings are used across time, in addition to the Bureau restoring the previous findings and adding "training failure," "supervisory failure" and "policy failure," would go a long way to increasing the public's ability to analyze whether the IPR system weighs too heavily in the police officers' favor.

--How often Racial Profiling and Use of Force were alleged is not highlighted

The report gives some indication of the frequency of certain specific kinds of complaints, but moves the details on general trends to the appendix (table 8, p. 37). Chapter 2 discusses the top community allegations but has chopped the list down from the top 8 (2008, p. 19) to the top 5 (p. 5). Chapter 3 gives the top 5 allegations made by Bureau members (p. 17). The "top allegations" is another category that would be useful to compare across time.

What we learn by looking at previous reports is that Force allegations were down from #2 to #3 from the public, and up from #5 (2008, p. 26) to #3 at the Bureau, comprising 14% of Bureau-initiated complaints. In 2002, Force was the #1 citizen complaint, and #2 in 2003-2006. (Figures are not available from the 2007 report, and the 2005-06 report only shows Rudeness and Force.) Racial Profiling was up from #5 to #4 from the public, but represented only 2% of complaints from the Bureau (p. 27). This is actually an uptick, as no Profiling was alleged by Bureau members in 2006-2008. In 2003 and 2004, public complaints of "harassment" were listed as #3 and #6, respectively, while "discrimination" was #13 and #14. "Profiling" was at #17 in 2004, when a much longer list of allegations was provided.

Though it wasn't in the top 8 last year, False Arrest is way up this year to #5, a position it held in 2004. False Charges was #2 in 2003. The top two complaints of 2009 were Rudeness and Failure to Act (p. 5).

Force and Profiling (/Disparate Treatment) are shown as percentages of all complaints in table 8, with Force climbing slightly to 7% of complaints, up from 6% in 2008 after holding steady at 8% for many years. Disparate Treatment has been steady at about 5% each year. A reference to this table by number in the text of the report would be useful, as would a reference to the table on "Who Files Complaints" (table 6, p. 35), since both are in the appendix. These tables were included in the body of the report in 2008 (pages 10 and 11, respectively).

Also missing in the "Bureau-Initiated Complaints" section is a breakdown showing all allegations made and their outcomes: for instance, in 2008, three allegations of "use of position for personal gain" were sustained (p. 26).

--Shootings and Deaths

A table showing how many of the shootings and in-custody incidents were fatal (2008, page 28) is very helpful and should not be missing. While the graphic showing the number of shootings and deaths per year now matches other trend graphs by only reaching back five years (p. 21), we've noted before that the previous graphs that stretched to 1997 should have gone back to at least 1995, when there was only one shooting. If it were possible to analyze trends in police shootings, it should be done over a longer period of time. Also, since the PARC report stopped printing statistics about the race and gender of shooting victims and officers after its first report in 2003, IPR should publish that information in its reports.

--Tort Claims

One of IPR's best actions since its inception was to begin reviewing police misconduct tort claims (notices of civilians' intent to sue the city) to see whether the allegations made are worth pursuing as complaints. The theory is that proving misconduct might hurt the city financially in the short run, but prevent future incidents in the long run. Once again, the inclusion of information about IPR's handling of tort claims is helpful. However, IPR reportedly opened investigations into just seven of 165 claims (4%, p. 19), and does not report this year on whether IAD followed up by conducting full investigations on any of these cases. IPR opened investigations on 13 cases in 2008 (of 163, or 8%), yet only one of those cases was investigated by IAD (2008 p. 23).

In addition to a chart showing what happens to the opened investigations, it would be useful to track the reasons IPR does not open case files on tort claims across time. The percentages of claims for reimbursement (45-55%) and other reasons were relatively steady over the past two years. Portland Copwatch continues to be disturbed, however, by IPR declining to look into allegations of police misconduct that "were explained by police reports" or which, in an initial legal claim, supposedly contain "insufficient evidence."

--Also missing

...A report on the Use of Force Review Board, which in last year's report (p. 29) was an excellent start to making that internal review system more transparent, and should be included again.

...There is no longer any discussion of the number of complaints by precinct, including a table (2008, page 9). This is fairly standard in reports from around the country and the information should be shared with the public.

...Biographical information for the Citizen Review Committee (CRC) members. Other cities also include short biographies of key staff.

...The only mention of the IPR's quarterly reports, which give some details about individual cases and policy issues, is on page 27 under the heading "CRC Workgroups." These reports should be directly referenced as sources to find more information.

Details In, Details Out

In a few places, the report does give details that more concretely demonstrate both the issues raised by complainants and the workings of the complaint system. In other areas, vague descriptions actually do disservice to the IPR and CRC's work.

--The good:

...There is a specific example cited on page 14 of the IPR sending a previously declined case back to IAD to investigate allegations including force and improper stop/search.

...On page 15, there is a detailed example of the IPR sending a case back to IAD for further investigation involving a violation of the Bureau's foot pursuit policy, which resulted in a proposed "sustained" finding. It is unclear, however, whether that finding led to any discipline.

...The note under "Officer-Involved Shootings and In-Custody Deaths" regarding the Auditor hiring outside experts to review the death of James Chasse (p. 21) is a terrific step forward for IPR. Not only is it significant, as they noted, that the review began before the family's civil case was settled, but it is the first time we can recall the IPR using the name of a police shooting/death in custody victim in an annual report. We encourage more of this in the future.

...Information that Taser use has not gone down despite overall declines in reported use of force (p. 19) is the kind of information that should be highlighted throughout the report.

--The missing:

...In past annual reports, each appeal heard by the Citizen Review Committee included details about the allegations made and specifically what changes CRC proposed to Bureau findings. This year, there are no case summaries, and descriptions such as "The complainant appealed all four allegations in this case" (pp. 9-10) are relatively meaningless. While this information may be available elsewhere, the annual reports are the only place to see the CRC's work cumulatively summarized.

...Discussion of the Force Task Force's work refers to 16 recommendations made and analyzed in its 2007 and 2009 reports, but gives no examples. Nor does it illustrate the claim that "injuries to officers and subjects declined" (p. 19). Brief summary details would be useful for those who do not have time to track down the full report.

Discipline and Repeat Offender Officers

It is understandable, to a point, that officer discipline is a personnel matter and can be shielded from public records laws. However, the IPR could do a better job reporting on officer discipline, as well as on officers who have received multiple complaints over time.


The discipline table (p. 18) would be more meaningful if it described the actions for which the officers received time off, were terminated or resigned/retired. This would help show what kinds of misconduct lead to these outcomes. Most of the time, if officers are terminated or forced to resign due to criminal charges, the information has been in the newspaper. However, answering questions for the community and Bureau members as to what kinds of serious misconduct lead to six officers leaving the force and 22 other officers receiving discipline could help increase trust and prevent future occurrences.

For example, it was widely publicized that Sgt. Kyle Nice and Officer Christopher Humphreys were given 80 hours of unpaid leave for failing to bring James Chasse to the hospital following use of a Taser. Presumably, these are two of the five officers who received 10 to 150 hours off in the table. What prompted the three other disciplinary actions, and the two that received over 150 hours off?

Reporting this information can also close the loop of the IPR's reports: the number of sustained findings equals the number of reported disciplinary actions from 2005-2008. However, there are 12 more cases with sustained findings in 2009 than disciplinary actions, presumably because disciplinary action is pending in those cases. It is possible that more than one officer has been disciplined for a single sustained finding, if the allegation was leveled against multiple officers. Because that information is not explicit in the report, it is hard to know how often sustained findings result in discipline. This is particularly important because officers have the ability to use "mitigation," a one-on-one meeting with the Chief, to have their discipline overturned for personal or other reasons.

--Officers with multiple complaints

The IPR's selective use of information leaves a confused picture about whether the complaint system is doing anything to curb repeat offender officers. IPR makes a point of noting that since 2006, no officer has shown up twice on the top 5 complaint list (page 18), yet admits one officer was on the list in both 2005 and 2007 (a two-year spread) and another in 2005 and 2009 (a four-year spread). In other words, someone from the 2006-2009 lists could show up again in 2010-2013; this analysis shows only that officers who receive multiple complaints tone down their behavior for two to four years before acting up again.

In addition, the 2008 report (p. 27) refers to one officer who had 30 complaints in 2003-2005, then reappeared in the multiple complaint list in 2008. Another officer, discussed in the new report, received 14 Use of Force complaints in five years, and had two new Force complaints this year, even though the 2008 report says this officer was reassigned and subjected to a "behavior review" to reduce his/her use of force (2009 report, p. 20, 2008 report p. 31-32).

Community Feedback

It is useful that the IPR has saved room in the report by collapsing the questions on their feedback survey into two tables (pages 31 & 33), rather than 11 separate graphs. It would be more transparent, though, to include the survey information as part of the body of the report rather than burying it in the appendix.

Overall satisfaction, they note, has gone down to 37% from 44%. Dissatisfaction, noted in a separate table, held steady at 50%. It cannot be overlooked that the IPR has never received a satisfaction rate over 50% on its own survey, or in the more generally worded Auditor's survey (p. 34).

We do appreciate that the IPR apparently more than doubled the number of surveys sent, producing roughly double the responses--444 surveys sent (up from 197) resulted in 75 responses (up from 35). We also appreciate that the IPR's caution about the surveys takes a more neutral tone than in the past. Rather than implying that only "sour grapes" fill out the surveys, the new report merely notes that "it is very difficult to determine the degree to which the complainants who responded... are similar to (or different from) the 80% who did not respond" (p. 32).

If one more change could be made, there should be another table comparing dissatisfaction rates over a five-year period, since only satisfaction rates are reported that way in the 2009 report.

Correcting the Record

In several places, the IPR Annual Report gives out misleading information that could give outsiders the wrong idea about what IPR and CRC actually do.

...As in the past, the report states, accurately, that IPR can conduct independent investigations (twice--pages 3 & 7), although this has never happened in 8-1/2 years, a fact pointed out by Ms. Luna Firebaugh and many community members.

...IPR is dismissing complaints based on "the likelihood that the alleged misconduct cannot be proven" (p. 11). However, the guidelines only allow dismissal if it is likely that misconduct did not occur (p. 39). Whether or not misconduct can be proven cannot be determined until an investigation is done.

...In a related point, IPR admits they are dismissing more cases because the complaint does not allege any misconduct. This category is up from 24-42% of dismissals in 2002-2006 to 62% in 2009 (p. 11).

...The report categorizes votes by the Citizen Review Committee to "challenge" Police Bureau findings that are not supported by the evidence as "rejecting" the findings, a harsher term than is formally used (p. 8).

...CRC is described by IPR in the report (and elsewhere) as being an "advisory body to the City Auditor and the Independent Police Review" (p. 27), yet CRC directly advises the Bureau and City Council through the appeals process. What little power this civilian body is given should not be undermined by carelessly chosen language.

...Describing City Council's role in the appeals process, which has only been invoked one time since 2002, the report says that their vote is a "final, binding decision" (p. 8). However, we believe that under "union" contract terms, the officer can still use a "mitigation" hearing to overturn a sustained finding made by Council. Language in the 2008 report was more accurate: "If City Council changes the findings, the Chief of Police is required to adopt the Council's findings and determine what discipline, if any, should be imposed" (2008 p. 6).

...While it is commendable that IPR staff underwent cultural competency training (p. 24), it should not go without notice that this was done in response to the interim Bias Based Policing report from the Citizen Review Committee.

Questions Raised

...Since complaints against officers are rejected some 70% of the time, we wonder whether commendations are automatically entered into officers' records (the "Employee Information System"). The 2008 report (p. 60) explained that the Bureau now tracks the commendations, but since IPR handles incoming commendations (p. 3), the statistics and an explanation should appear in the Annual Report. Part of that explanation should include whether commendations are investigated to be sure someone isn't just trying to bolster the career of a friend or relative.

...Sometimes IPR refers dismissed cases to Precinct Commanders for review, yet these "Precinct Referrals" are not formally documented (p. 11). Why not?

...If the IPR approved 40 IAD investigations and sent back eight cases for more investigation (p. 15), why is the total number of cases closed by IAD 58 (p. 16), or ten higher? We assume these were cases carried over from 2008, but the report should be explicit to avoid confusing readers.

...If, according to the report's description of the chain of command, proposed findings by an officer's commander can be challenged ("controverted") by the IPR Director, the IAD Captain or an Assistant Chief before heading to the Performance/Use of Force/Police Review boards (p. 15), isn't this an argument as to why the officer's commander should not be a voting member of those boards?



Portland Copwatch continues to welcome the IPR's more timely release of their annual report, and applauds the efforts to make the report more readable and objective. We hope that in the future, the other recommendations we have set forth will help make Portland's system of reviewing allegations of officer misconduct more transparent, and lead to reductions in corruption, brutality and racism in our Bureau.

Portland Copwatch home page
Peace and Justice Works home page

Posted June 25, 2010, updated June 4, 2011