Posted: December 6th, 2022
1. The ethics paper must be typed and submitted on Canvas for evaluation. This written assignment should be a 4-page paper, double-spaced, in a readable 11-12 point font, excluding the reference section. Please do not exceed four (4) typed pages.
2. The ethics paper should be typed with one (1) inch margins on the sides, top, and bottom. It should contain a heading that includes the title of the assignment, the date, and the instructor’s name (no cover sheet required). It should include in-text citations pursuant to APA formatting style. The paper should include a reference section at the end of the document (endnotes) with a proper citation for each source. For an APA guide and APA citation examples, please visit https://owl.purdue.edu/owl/research_and_citation/apa_style/apa_style_introduction.html. You can also watch https://youtu.be/Q7TBmi8-9G0 on how to write in APA style.
3. This paper requires critical thinking and research. Avoid informal verbiage in your writing.
4. You may use any materials you want. When you use the textbook, articles, reports, supplementary materials, or class handouts, you must provide an ACCURATE citation each time you use such materials in your writing. A link to APA citations is provided in your syllabus as well as under item 2 above.
5. Read the facts in Part A and the questions carefully, and outline your answers prior to writing. Organization is key.
6. There is no single correct answer. If your arguments are persuasive and compelling, and your essay is well organized, substantive, and properly formatted, you will be eligible for full points no matter which answer you choose.
Part A: Facts
While new technologies can help to break down barriers and empower people by combating discrimination and promoting dignity, there are serious concerns about the discriminatory outcomes these machines can produce. A growing body of research suggests that algorithms and artificial intelligence are not a panacea for eradicating prejudice, and that they can have disproportionate impacts on groups that are already socially disadvantaged.
As artificial intelligence (AI) algorithms learn patterns in data, they also absorb the biases in it. For example, Google showed more ads for lower-paying jobs to women than to men, Amazon’s same-day delivery bypassed black neighborhoods, and the software on several types of digital cameras struggled to recognize the faces of non-white users. In one of the most striking examples, an algorithm called COMPAS, used by law enforcement agencies across multiple states to assess a defendant’s risk of reoffending, was found to falsely flag black individuals almost twice as often as whites, according to a ProPublica investigation. There are similar concerns about algorithmic bias in facial-recognition technology, which already has a far broader impact than most people realize: over 117 million American adults have had their images entered into a law-enforcement agency’s face-recognition database, often without their consent or knowledge, and the technology remains largely unregulated. Further examples of data bias emerged in Nikon’s camera software, which misread images of Asian people as blinking, and in Hewlett-Packard’s web camera software, which had difficulty recognizing people with dark skin tones. These challenges relate either to the data itself or to the way algorithms are designed, developed, and deployed. Machine learning systems lack transparency and auditability, and they are developed almost entirely by small, homogeneous teams, most often composed of men.
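The mechanism by which an algorithm "absorbs" bias from data can be made concrete with a minimal sketch. All names and numbers below are invented for illustration: a naive learner copies the majority historical decision per neighborhood, and because neighborhood is a near-perfect proxy for group, it reproduces past discrimination without ever seeing the group attribute.

```python
from collections import Counter

# Hypothetical toy data: each record is
# (neighborhood, qualified, group, historically_approved).
# Qualification rates are identical in both neighborhoods, but past
# human decisions favored "north" (group A) over "south" (group B).
history = (
    [("north", True,  "A", True)]  * 80 +
    [("north", False, "A", True)]  * 20 +
    [("south", True,  "B", False)] * 80 +
    [("south", False, "B", False)] * 20
)

def learn_rule(history):
    """Naive learner: per neighborhood, copy the majority historical label."""
    votes = {}
    for hood, _qualified, _group, approved in history:
        votes.setdefault(hood, []).append(approved)
    return {hood: Counter(v).most_common(1)[0][0] for hood, v in votes.items()}

rule = learn_rule(history)
print(rule)  # {'north': True, 'south': False}
```

Even though the group attribute never enters the model, the learned rule approves every group-A applicant and denies every group-B applicant — the historical bias survives intact in the proxy feature.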
On August 13, 2018, the Assistant Secretary for Fair Housing and Equal Opportunity (“Assistant Secretary”) filed a complaint with the Department of Housing and Urban Development (“HUD”) alleging that Facebook Inc. violated the Fair Housing Act by discriminating because of race, color, religion, sex, familial status, national origin and disability. Subsequently, HUD issued a Charge of Discrimination (“Charge”) on behalf of aggrieved persons following an investigation and a determination that reasonable cause exists to believe that a discriminatory housing practice has occurred.
Facebook is the second-largest online advertiser in the United States and is responsible for approximately twenty percent of all online advertising nationwide. Facebook collects millions of data points about its users, draws inferences about each user based on this data, and then charges advertisers for the ability to micro-target ads to users based on Facebook’s inferences about them. These ads are then shown to users across the web and in mobile applications. As Facebook explains, its advertising platform enables advertisers to “[r]each people based on zip code, age and gender, specific languages, the interests they’ve shared, their activities, the Pages they’ve liked, their purchase behaviors or intents, device usage and more.” Thus, Facebook “uses location-related information, such as your current location, where you live, the places you like to go, and the businesses and people you’re near, to provide, personalize and improve its Products, including ads, for you and others.”
Facebook holds out its advertising platform as a powerful resource for advertisers in many industries, including housing and housing-related services. Such ads include ads for mortgages from large national lenders, ads for rental housing from large real estate listing services, and ads for specific houses for sale from real estate agents.
Facebook has provided a toggle button that enables advertisers to exclude men or women from seeing an ad, a search-box to exclude people who do not speak a specific language from seeing an ad, and a map tool to exclude people who live in a specified area from seeing an ad by drawing a red line around that area. Facebook also provides drop-down menus and search boxes to exclude or include people who share specified attributes. Facebook has offered advertisers hundreds of thousands of attributes from which to choose, for example to exclude “women in the workforce,” “moms of grade school kids,” “foreigners,” “Puerto Rico Islanders,” or people interested in “parenting,” “accessibility,” “service animal,” “Hijab Fashion,” or “Hispanic Culture.” Facebook also has offered advertisers the ability to limit the audience of an ad by selecting to include only those classified as, for example, “Christian” or “Childfree.”
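The exclusion tools described above amount to set-based audience filtering. A minimal sketch, with invented user IDs and attributes drawn from the examples in the charge, shows how an exclusion list narrows the eligible audience:

```python
# Hypothetical sketch: users mapped to the interest attributes an ad
# platform has assigned them (all data invented for illustration).
users = {
    "u1": {"moms of grade school kids", "parenting"},
    "u2": {"hiking"},
    "u3": {"Hijab Fashion"},
    "u4": {"cooking", "hiking"},
}

def eligible_audience(users, excluded_attrs):
    """Keep only users who carry none of the excluded attributes."""
    return sorted(u for u, attrs in users.items() if not (attrs & excluded_attrs))

print(eligible_audience(users, {"parenting", "Hijab Fashion"}))  # ['u2', 'u4']
```

When the excluded attributes correlate with a protected class, this simple set intersection quietly removes members of that class from the audience.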
Facebook alone, not the advertiser, determines which users will constitute the “actual audience” for each ad. Facebook structured its ad delivery system such that it generally will not deliver an ad to users whom the system determines are unlikely to engage with the ad, even if the advertiser explicitly wants to reach those users regardless. Facebook uses machine learning and other prediction techniques to classify and group users to project each user’s likely response to a given ad. In doing so, Facebook inevitably recreates groupings defined by their protected class. For example, the top Facebook pages users “like” vary sharply by their protected class, according to Facebook’s “Audience Insights” tool. Therefore, by grouping users who “like” similar pages (unrelated to housing) and presuming a shared interest or disinterest in housing-related advertisements, Facebook’s mechanisms function just like an advertiser who intentionally targets or excludes users based on their protected class.
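The delivery mechanism described above can be sketched in a few lines. Users, pages, groups, and click rates below are all invented; the point is only that filtering on predicted engagement, derived from page "likes," can skew the actual audience along group lines the system never explicitly considers:

```python
# Hypothetical toy model of engagement-optimized ad delivery.
users = [
    # (user_id, liked_page, group) -- group is never shown to the model
    ("u1", "PageX", "group1"),
    ("u2", "PageX", "group1"),
    ("u3", "PageY", "group2"),
    ("u4", "PageY", "group2"),
]

# Historical click-through rate on housing ads, keyed by liked page.
# Suppose group1's favorite page happens to correlate with past clicks.
housing_ctr = {"PageX": 0.08, "PageY": 0.01}

def actual_audience(users, ctr, threshold=0.05):
    """Deliver only to users the system predicts will engage with the ad."""
    return [uid for uid, page, _group in users if ctr[page] >= threshold]

print(actual_audience(users, housing_ctr))  # ['u1', 'u2']
```

The advertiser never excluded group2, yet only group1 members see the housing ad: the "likely to engage" filter recreates a grouping defined by protected class.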
Part B – Questions
Based on the facts provided above, write an essay answering the questions below:
1) Should Facebook be held ethically liable for its biased housing advertisements? Why or why not?
2) Identify procedures that would help Facebook prevent data bias, ensure data quality, and improve its ethical responsibility toward its users. What safeguards should Facebook have in place to maintain the ethical integrity of its AI-driven business model?
3) Do you think Facebook should be held strictly liable, meaning that Facebook and all of its agents, employees, successors, and all other persons in active concert or participation with it who have participated in discriminating because of race, color, religion, sex, familial status, national origin, or disability in any aspect of the sale, rental, use, marketing, or advertising of dwellings and related services must be held liable? Or should Facebook be held liable only under a theory of negligence?
End of the Ethics Paper Assignment