This week at the ACLU of Maine: Oral arguments in NHICS v. Trump at the federal appeals court, recapping the 2025 legislative session, a save-the-date for an upcoming event in Portland, and more!
NHICS v. Trump, Brought by the ACLU of Maine and Others, Heads to the U.S. Court of Appeals for the First Circuit
Only hours after Trump's inauguration, the ACLU of Maine and other partner organizations filed a lawsuit challenging the president's executive order restricting birthright citizenship. In February, the federal judge in this case temporarily blocked the executive order from taking effect, but the government appealed to the U.S. Court of Appeals for the First Circuit. On August 1, the appeals court heard arguments in Boston. While the case is under review, the judge’s original injunction remains in place, blocking enforcement of the order in all states within the First Circuit.
We filed a second, separate case challenging Trump’s birthright citizenship executive order in June — here’s why:
That same month, the Supreme Court issued a decision limiting the use of nationwide injunctions in three other cases challenging the order. In response, the attorneys who brought NHICS v. Trump filed a new case, Barbara v. Trump. In Barbara, the court certified a class of all babies affected by the executive order and blocked enforcement of the order against them.
This approach — using a nationwide class action — protects all babies born in the U.S. under the order, without relying on a nationwide injunction, which the Supreme Court is now restricting.
Watch our 2025 Legislative Recap Webinar
On Wednesday of this week, we held a live webinar recapping a busy and productive legislative session in Augusta. Our policy director, Michael Kebede, and policy fellow, Alicia Rea, walked through each of the ACLU of Maine's six legislative priorities, highlighting our wins and setting the stage for future work in those areas.
If you missed it, or want to revisit parts of the conversation, we've uploaded a recording of the webinar here.
Save the Date: ACLU National Board President Comes to Portland
We’re excited to welcome Deborah Archer, the ACLU’s national board president, to Portland!
Join us for a conversation with Deborah about her recent book, Dividing Lines: How Transportation Infrastructure Reinforces Racial Inequality. In her book, Deborah offers a critical examination of how transportation infrastructure, from highways and roads to sidewalks and buses, has been used to maintain segregation and deepen racial inequality after the fall of Jim Crow.
The event will include an author talk and a panel discussion with local experts exploring transportation justice in Portland and across the country.
The ACLU of Maine will be closed beginning August 8 to give our staff a chance to recharge and enjoy the magic of summer in Maine. We'll be back on August 11 to continue our work!
Ami Kachalia, Campaign Strategist, NJ ACLU Policy Department
This blog was originally published by the ACLU of NJ on July 30, 2025.
As announced in a letter from the Department of Defense, the Trump administration plans to use Fort Dix, the U.S. Army post that is part of the tri-service Joint Base McGuire-Dix-Lakehurst, to detain immigrants. The South Jersey military base will house up to 3,000 detention beds for Immigration and Customs Enforcement (ICE).
With this expansion to Fort Dix, New Jersey continues to be an epicenter of President Trump’s mass deportation agenda. Earlier this year, Delaney Hall, the largest detention facility on the East Coast, opened in Newark, quadrupling detention capacity in New Jersey.
The Trump administration’s mass detention apparatus is unprecedented, and employing military resources to detain noncitizens is not normal. The government is pouring money into incarcerating our neighbors, with the most recent federal budget bill funneling $75 billion to ICE for enforcement and detention, often lining the pockets of private prison executives at the expense of humanity, equality, and decency. We can – and must – work to end the criminalization of the immigration system and the mass detention of immigrant communities.
Turning military bases into massive tent detention sites is not only unnecessary and costly; it’s another dehumanizing spectacle designed to intimidate all of us and deprive those detained of their rights. And it diverts important resources needed for military readiness.
It has been widely documented by Congress, oversight agencies, physicians, journalists, advocates, and whistleblowers that the immigration detention system is rife with abuse, dangerous conditions, and medical neglect. Studies have also shown that community-based alternatives to detention save taxpayers money and yield better court-appearance rates. Because Fort Dix is a military base, the way it operates is likely to make the frightening reality of immigration detention in New Jersey even worse.
Our leaders must hold the Trump administration accountable: the ACLU of New Jersey calls on members of Congress to speak out against the inhumane use of military bases as immigration detention centers and to exercise their oversight authority at any such facilities.
Expanding immigration detention to military facilities sets a dangerous precedent for co-opting military resources for domestic law enforcement and is contrary to everything our nation was created to represent. As the Trump administration’s extreme immigration agenda continues to threaten our communities, the ACLU-NJ will continue to dedicate ourselves to defending the fundamental freedoms of our democracy for all.
The ACLU’s first Civil Rights in the Digital Age (CRiDA) AI Summit in July brought together leaders from civil society, nonprofits, academia, and industry to thoughtfully consider how to center civil rights and liberties in a constantly changing digital age.
Leaders and experts from the ACLU, the Patrick J. McGovern Foundation, Hugging Face, Amnesty International, the Future of Life Institute, the Kapor Foundation, the Mozilla Foundation, and other major organizations discussed how organizations can build an equitable and just future for AI.
Organizations Must Collaborate to Shape the Future of AI
One way organizations can develop AI responsibly, experts shared, is through partnership and collaboration. That means working with civil rights organizations, inviting community participation, and drawing on diverse perspectives for input on AI policies being considered for adoption.
“We need conversations about how AI supports more than just profit, but purpose. And here at the ACLU this morning, we really dug into that question,” Vilas Dhar, president and trustee of the Patrick J. McGovern Foundation, shared. “What are the institutions we have to build and support that protect all of our interests in an AI-enabled age?”
To lead by example, the ACLU has a cross-functional working group of experts from across the organization who carry out a holistic review when adopting generative AI tools, assessing whether a tool aligns with ACLU values. This approach ensures that innovation does not leave groups of people behind.
CRiDA experts, like Dhar, also explored why innovation should not be rushed. Technology developers and leaders must take the time to be curious, learn, and collaborate on AI systems’ design, deployment, and evaluation, carefully weighing critical issues such as AI’s impact on the environment and privacy, among other topics. While AI is an exciting frontier, organizations can employ tools, such as vendor questionnaires, to identify and understand the risks of specific AI tools and their alignment with an organization’s values related to privacy, fairness, transparency, and accountability.
We Must Protect Our Privacy and Data in the Age of AI
Panelists also discussed a crucial question: What laws regulate facial recognition?
“Not enough,” Nathan Freed Wessler, deputy director of the ACLU Speech, Privacy and Technology project, replied. “In a lot of parts of the country, there's actually no law, nothing from Congress, and most states haven't acted. But there are places that are real leaders...cities have actually banned police from using face recognition technology because it's so dangerous.”
CRiDA experts also explored how AI systems today are trained by developers on vast amounts of data including personal, academic, and behavioral information — often without the consent of the individuals behind this data. The threat doesn’t stop at passive data usage. AI-powered surveillance systems — whether it’s facial recognition or predictive policing — are trained on prejudiced data and used in ways that disproportionately target communities of color, further embedding discrimination into our social reality.
ACLU CRiDA Summit panel members Ijeoma Mbamalu, Vilas Dhar, and Deborah Archer.
Credit: ACLU
This only amplifies the need for transparency and accountability when adopting AI systems. In practice, transparency for organizations considering AI can include requiring vendors to detail the sources of the data used to develop their systems, their measures for assessing risks of bias and discrimination, and any guardrails they have implemented to measure and address these risks. These questions are grounded in the need for transparency, and they are also critical for maintaining equity and fairness.
When developing CRiDA, one of the ACLU’s goals was to highlight the privacy implications of modern AI systems and the vast amounts of personal data used to power them. Together, experts and leaders across the board agreed that we all need transparency on how and when our data is used.
Ensuring AI Does Not Deepen the Digital Divide
AI has entered everyday life in a variety of ways, from the classroom to the hiring process. We understand that AI can be an asset to an organization’s work if implemented thoughtfully and responsibly. For example, if designed and governed carefully with appropriate guardrails, AI systems could be used to support critical educational and economic opportunities. But at the same time, AI systems can have the opposite impact: when they are not designed and deployed carefully, we risk exacerbating the racial wealth gap and harming rather than helping marginalized communities.
To build a future of tech based on fairness and equity, CRiDA experts such as Deborah Archer, president of the ACLU, called on AI developers not only to include civil society leaders in the room when developing AI, but also to expand diversity, equity, and inclusion, invest in causes that seek to close the digital divide, and fund trainings for marginalized communities to build technology, so the next generation has an equal opportunity to succeed.
“It's not just enough to say, ‘We want diverse people in the room,’ and ‘We want diverse people to do this work’ if we’re not also doing the work to make sure that all of those people have access to the education, the resources, the opportunities, and the networks that equip them to do the work and then put them in the spaces to take advantage of the opportunities,” said Archer. “So, it is connected to all the other work the ACLU is doing, and other people are doing, around diversity, equity, and inclusion.”
The ACLU has fought both in courts and in communities to address and remedy AI’s systemic harms. We must meet the moment and urge tech, political, and civil society leaders to keep civil rights at the center of AI innovation.
Policies Centering Civil Liberties and Civil Rights in the Digital Age
The future of AI depends on a commitment from our leaders to ensure that AI aligns with the core principles and liberties that our Constitution envisions. The House of Representatives passed H.R. 1, the so-called One Big Beautiful Bill Act, in May. Earlier versions of the bill included a moratorium on state and local laws regulating AI. Luckily, with the support of everyone who called on Congress, the provision was taken out.
On the heels of this vote, experts and leaders at the ACLU’s CRiDA summit highlighted how, as AI gains power, now is the time to push for guardrails protecting civil liberties.
“Congress is trying to stop states from passing legislation and other regulations to protect us from AI through a tool called preemption,” Cody Venzke, senior policy counsel with the ACLU National Political Advocacy Division, shared with leaders. “You can keep it out of any future legislation by reaching out to your representatives and senators and tell them: No AI moratoriums and no preemption of state laws.”
The ACLU will continue to fight for responsible AI design and deployment, in the courtroom and in Congress. Together with peer organizations, innovators, and advocates, we will continue to use our collective power to protect the digital rights of everyone, especially people from marginalized communities. This must be a priority for all of us, including the institutions and developers building these technologies.