Artificial Intelligence in City Government: From Adoption to Accountability

A Practical Framework for Innovation, Oversight, and Public Trust

A collaboration between Lewis McLain & AI – a companion to the previous blog on AI

Artificial intelligence has moved from novelty to necessity in public institutions. What began as experimental tools for drafting documents or summarizing data is now embedded in systems that influence budgeting, service delivery, enforcement prioritization, procurement screening, and public communication. Cities are discovering that AI is no longer optional—but neither is governance.

This essay unifies two truths that are often treated as competing ideas but must now be held together:

  1. AI adoption is inevitable and necessary if cities are to remain operationally effective and fiscally sustainable.
  2. AI oversight is now unavoidable wherever systems influence decisions affecting people, rights, or public trust.

These are not contradictions. They are sequential realities. Adoption without governance leads to chaos. Governance without adoption leads to irrelevance. The task for modern city leadership is to do both—intentionally.

I. The Adoption Imperative: AI as Municipal Infrastructure

Cities face structural pressures that are not temporary: constrained budgets, difficulty recruiting and retaining staff, growing service demands, and rising analytical complexity. AI tools offer a way to expand institutional capacity without expanding payrolls at the same rate.

Common municipal uses already include:

  • Drafting ordinances, reports, and correspondence
  • Summarizing public input and staff analysis
  • Forecasting revenues, expenditures, and service demand
  • Supporting customer service through chat or triage tools
  • Enhancing internal research and analytics

In this sense, AI is not a gadget. It is infrastructure, comparable to ERP systems, GIS, or financial modeling platforms. Cities that delay adoption will find themselves less capable, less competitive, and more expensive to operate.

Adoption, however, is not merely technical. AI reshapes workflows, compresses tasks, and changes how work is performed. Over time, this may alter staffing needs. The question is not whether AI will change city operations—it already is doing so. The question is whether those changes are guided or accidental.

II. The Oversight Imperative: Why Governance Is Now Required

As AI systems move beyond internal productivity and begin to influence decisions—directly or indirectly—oversight becomes essential.

AI systems are now used, or embedded through vendors, in areas such as:

  • Permit or inspection prioritization
  • Eligibility screening for programs or services
  • Vendor risk scoring and procurement screening
  • Enforcement triage
  • Public safety analytics

When AI recommendations shape outcomes, even if a human signs off, accountability cannot be vague. Errors at scale, opaque logic, and undocumented assumptions create legal exposure and erode public trust faster than traditional human error.

Oversight is required because:

  • Scale magnifies mistakes: a single flaw can affect thousands before detection.
  • Opacity undermines legitimacy: residents are less forgiving of decisions they cannot understand.
  • Legal scrutiny is increasing: courts and legislatures are paying closer attention to algorithmic decision-making.

Oversight is not about banning AI. It is about ensuring AI is used responsibly, transparently, and under human control.

III. Bridging Adoption and Oversight: A Two-Speed Framework

The tension between “move fast” and “govern carefully” dissolves once AI uses are separated by risk.

Low-Risk, Internal AI Uses

Examples include drafting, summarization, forecasting, research, and internal analytics.

Approach:
Adopt quickly, document lightly, train staff, and monitor outcomes.

Decision-Adjacent or High-Risk AI Uses

Examples include enforcement prioritization, eligibility determinations, public safety analytics, and procurement screening affecting vendors.

Approach:
Require review, documentation, transparency, and meaningful human oversight before deployment.

This two-speed framework allows cities to capture productivity benefits immediately while placing guardrails only where risk to rights, equity, or trust is real.
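To make the triage concrete, here is a minimal sketch in Python. It is illustrative only: the category names, the single routing rule, and the function itself are assumptions chosen for demonstration, not definitions drawn from the Texas statute or the model ordinance in the appendices.

```python
# Illustrative sketch of two-speed triage for a proposed AI use.
# The risk categories and the routing rule below are assumptions for
# demonstration, not definitions from any statute or adopted ordinance.

HIGH_RISK_CONTEXTS = {
    "enforcement",     # enforcement prioritization or triage
    "eligibility",     # program or service eligibility screening
    "public_safety",   # public safety analytics
    "procurement",     # vendor scoring that affects awards
    "biometrics",      # biometric identification or surveillance
}

def triage(contexts: set[str], affects_individuals: bool) -> str:
    """Route a proposed AI use into the fast lane or the governed lane."""
    if affects_individuals and contexts & HIGH_RISK_CONTEXTS:
        return ("HIGH-RISK: require documented review, transparency, "
                "and meaningful human oversight before deployment")
    return "LOW-RISK: adopt quickly, document lightly, monitor outcomes"

# Example: a chatbot answering general questions stays in the fast lane;
# a permit-prioritization model does not.
print(triage({"customer_service"}, affects_individuals=True))
print(triage({"enforcement"}, affects_individuals=True))
```

The point of the sketch is the shape of the decision, not the specific keywords: a use enters the governed lane only when it both touches people and falls within a risk-bearing context.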

IV. Texas Context: Statewide Direction on AI Governance

The Texas Legislature reinforced this balanced approach through the Texas Responsible Artificial Intelligence Governance Act, effective January 1, 2026. The law does not prohibit AI use. Instead, it establishes expectations for transparency, accountability, and prohibited practices—particularly for government entities.

Key elements include:

  • Disclosure when residents interact with AI systems
  • Prohibitions on social scoring by government
  • Restrictions on discriminatory AI use
  • Guardrails around biometric and surveillance applications
  • Civil penalties for unlawful or deceptive deployment
  • Creation of a statewide Artificial Intelligence Council

The message is clear: Texas expects governments to adopt AI responsibly—neither recklessly nor fearfully.

V. Implications for Cities and Transit Agencies

Cities are already using AI, often unknowingly, through vendor-provided software. Transit agencies face elevated exposure because they combine finance, enforcement, surveillance, and public safety functions in a single operation.

The greatest risk is not AI itself, but uncontrolled AI:

  • Vendor-embedded algorithms without disclosure
  • No documented human accountability
  • No audit trail
  • No process for suspension or correction

Cities that act early reduce legal risk, preserve public trust, and maintain operational flexibility.

VI. Workforce Implications: Accurate and Defensible Language

AI will change how work is done over time. It would be inaccurate and irresponsible to claim otherwise.

At the same time, AI does not mandate immediate workforce reductions. In public institutions, workforce impacts—if they occur—are most likely to happen gradually through:

  • Attrition
  • Reassignment
  • Retraining
  • Role redesign

Final staffing decisions remain with City leadership and City Council. AI is a tool for improving capacity and sustainability, not an automatic trigger for reductions.

Conclusion: Coherent, Accountable AI

AI adoption without governance invites chaos. Governance without adoption invites stagnation. Cities that succeed will do both—moving quickly where risk is low and governing carefully where risk is high.

This is not about technology hype. It is about institutional competence in a digital age.


Appendix 1 — Texas Responsible Artificial Intelligence Governance Act (HB 149)


                                                   H.B. No. 149

AN ACT

relating to regulation of the use of artificial intelligence systems in this state; providing civil penalties.

BE IT ENACTED BY THE LEGISLATURE OF THE STATE OF TEXAS:

SECTION 1.  This Act may be cited as the Texas Responsible Artificial Intelligence Governance Act.

SECTION 2.  Section 503.001, Business & Commerce Code, is amended by amending Subsections (a) and (e) and adding Subsections (b-1) and (f) to read as follows:

(a)  In this section:

(1)  “Artificial intelligence system” has the meaning assigned by Section 551.001.

(2)  “Biometric identifier” means a retina or iris scan, fingerprint, voiceprint, or record of hand or face geometry.

(b-1)  For purposes of Subsection (b), an individual has not been informed of and has not provided consent for the capture or storage of a biometric identifier of an individual for a commercial purpose based solely on the existence of an image or other media containing one or more biometric identifiers of the individual on the Internet or other publicly available source unless the image or other media was made publicly available by the individual to whom the biometric identifiers relate.

(e)  This section does not apply to:

(1)  voiceprint data retained by a financial institution or an affiliate of a financial institution, as those terms are defined by 15 U.S.C. Section 6809;

(2)  the training, processing, or storage of biometric identifiers involved in developing, training, evaluating, disseminating, or otherwise offering artificial intelligence models or systems, unless a system is used or deployed for the purpose of uniquely identifying a specific individual; or

(3)  the development or deployment of an artificial intelligence model or system for the purposes of:

(A)  preventing, detecting, protecting against, or responding to security incidents, identity theft, fraud, harassment, malicious or deceptive activities, or any other illegal activity;

(B)  preserving the integrity or security of a system; or

(C)  investigating, reporting, or prosecuting a person responsible for a security incident, identity theft, fraud, harassment, a malicious or deceptive activity, or any other illegal activity.

(f)  If a biometric identifier captured for the purpose of training an artificial intelligence system is subsequently used for a commercial purpose not described by Subsection (e), the person possessing the biometric identifier is subject to:

(1)  this section’s provisions for the possession and destruction of a biometric identifier; and

(2)  the penalties associated with a violation of this section.

SECTION 3.  Section 541.104(a), Business & Commerce Code, is amended to read as follows:

(a)  A processor shall adhere to the instructions of a controller and shall assist the controller in meeting or complying with the controller’s duties or requirements under this chapter, including:

(1)  assisting the controller in responding to consumer rights requests submitted under Section 541.051 by using appropriate technical and organizational measures, as reasonably practicable, taking into account the nature of processing and the information available to the processor;

(2)  assisting the controller with regard to complying with requirements relating to the security of processing personal data, and if applicable, the personal data collected, stored, and processed by an artificial intelligence system, as that term is defined by Section 551.001, and to the notification of a breach of security of the processor’s system under Chapter 521, taking into account the nature of processing and the information available to the processor; and

(3)  providing necessary information to enable the controller to conduct and document data protection assessments under Section 541.105.

SECTION 4.  Title 11, Business & Commerce Code, is amended by adding Subtitle D to read as follows:

SUBTITLE D.  ARTIFICIAL INTELLIGENCE PROTECTION

CHAPTER 551.  GENERAL PROVISIONS

Sec. 551.001.  DEFINITIONS.  In this subtitle:

(1)  “Artificial intelligence system” means any machine-based system that, for any explicit or implicit objective, infers from the inputs the system receives how to generate outputs, including content, decisions, predictions, or recommendations, that can influence physical or virtual environments.

(2)  “Consumer” means an individual who is a resident of this state acting only in an individual or household context.  The term does not include an individual acting in a commercial or employment context.

(3)  “Council” means the Texas Artificial Intelligence Council established under Chapter 554.

Sec. 551.002.  APPLICABILITY OF SUBTITLE.  This subtitle applies only to a person who:

(1)  promotes, advertises, or conducts business in this state;

(2)  produces a product or service used by residents of this state; or

(3)  develops or deploys an artificial intelligence system in this state.

Sec. 551.003.  CONSTRUCTION AND APPLICATION OF SUBTITLE.  This subtitle shall be broadly construed and applied to promote its underlying purposes, which are to:

(1)  facilitate and advance the responsible development and use of artificial intelligence systems;

(2)  protect individuals and groups of individuals from known and reasonably foreseeable risks associated with artificial intelligence systems;

(3)  provide transparency regarding risks in the development, deployment, and use of artificial intelligence systems; and

(4)  provide reasonable notice regarding the use or contemplated use of artificial intelligence systems by state agencies.

CHAPTER 552.  ARTIFICIAL INTELLIGENCE PROTECTION

SUBCHAPTER A.  GENERAL PROVISIONS

Sec. 552.001.  DEFINITIONS.  In this chapter:

(1)  “Deployer” means a person who deploys an artificial intelligence system for use in this state.

(2)  “Developer” means a person who develops an artificial intelligence system that is offered, sold, leased, given, or otherwise provided in this state.

(3)  “Governmental entity” means any department, commission, board, office, authority, or other administrative unit of this state or of any political subdivision of this state, that exercises governmental functions under the authority of the laws of this state.  The term does not include:

(A)  a hospital district created under the Health and Safety Code or Article IX, Texas Constitution; or

(B)  an institution of higher education, as defined by Section 61.003, Education Code, including any university system or any component institution of the system.

Sec. 552.002.  CONSTRUCTION OF CHAPTER.  This chapter may not be construed to:

(1)  impose a requirement on a person that adversely affects the rights or freedoms of any person, including the right of free speech; or

(2)  authorize any department or agency other than the Department of Insurance to regulate or oversee the business of insurance.

Sec. 552.003.  LOCAL PREEMPTION.  This chapter supersedes and preempts any ordinance, resolution, rule, or other regulation adopted by a political subdivision regarding the use of artificial intelligence systems.

SUBCHAPTER B. DUTIES AND PROHIBITIONS ON USE OF ARTIFICIAL INTELLIGENCE

Sec. 552.051.  DISCLOSURE TO CONSUMERS.  (a)  In this section, “health care services” means services related to human health or to the diagnosis, prevention, or treatment of a human disease or impairment provided by an individual licensed, registered, or certified under applicable state or federal law to provide those services.

(b)  A governmental agency that makes available an artificial intelligence system intended to interact with consumers shall disclose to each consumer, before or at the time of interaction, that the consumer is interacting with an artificial intelligence system.

(c)  A person is required to make the disclosure under Subsection (b) regardless of whether it would be obvious to a reasonable consumer that the consumer is interacting with an artificial intelligence system.

(d)  A disclosure under Subsection (b):

(1)  must be clear and conspicuous;

(2)  must be written in plain language; and

(3)  may not use a dark pattern, as that term is defined by Section 541.001.

(e)  A disclosure under Subsection (b) may be provided by using a hyperlink to direct a consumer to a separate Internet web page.

(f)  If an artificial intelligence system is used in relation to health care service or treatment, the provider of the service or treatment shall provide the disclosure under Subsection (b) to the recipient of the service or treatment or the recipient’s personal representative not later than the date the service or treatment is first provided, except in the case of emergency, in which case the provider shall provide the required disclosure as soon as reasonably possible.

Sec. 552.052.  MANIPULATION OF HUMAN BEHAVIOR.  A person may not develop or deploy an artificial intelligence system in a manner that intentionally aims to incite or encourage a person to:

(1)  commit physical self-harm, including suicide;

(2)  harm another person; or

(3)  engage in criminal activity.

Sec. 552.053.  SOCIAL SCORING.  A governmental entity may not use or deploy an artificial intelligence system that evaluates or classifies a natural person or group of natural persons based on social behavior or personal characteristics, whether known, inferred, or predicted, with the intent to calculate or assign a social score or similar categorical estimation or valuation of the person or group of persons that results or may result in:

(1)  detrimental or unfavorable treatment of a person or group of persons in a social context unrelated to the context in which the behavior or characteristics were observed or noted;

(2)  detrimental or unfavorable treatment of a person or group of persons that is unjustified or disproportionate to the nature or gravity of the observed or noted behavior or characteristics; or

(3)  the infringement of any right guaranteed under the United States Constitution, the Texas Constitution, or state or federal law.

Sec. 552.054.  CAPTURE OF BIOMETRIC DATA.  (a)  In this section, “biometric data” means data generated by automatic measurements of an individual’s biological characteristics.  The term includes a fingerprint, voiceprint, eye retina or iris, or other unique biological pattern or characteristic that is used to identify a specific individual.  The term does not include a physical or digital photograph or data generated from a physical or digital photograph, a video or audio recording or data generated from a video or audio recording, or information collected, used, or stored for health care treatment, payment, or operations under the Health Insurance Portability and Accountability Act of 1996 (42 U.S.C. Section 1320d et seq.).

(b)  A governmental entity may not develop or deploy an artificial intelligence system for the purpose of uniquely identifying a specific individual using biometric data or the targeted or untargeted gathering of images or other media from the Internet or any other publicly available source without the individual’s consent, if the gathering would infringe on any right of the individual under the United States Constitution, the Texas Constitution, or state or federal law.

(c)  A violation of Section 503.001 is a violation of this section.

Sec. 552.055.  CONSTITUTIONAL PROTECTION.  (a)  A person may not develop or deploy an artificial intelligence system with the sole intent for the artificial intelligence system to infringe, restrict, or otherwise impair an individual’s rights guaranteed under the United States Constitution.

(b)  This section is remedial in purpose and may not be construed to create or expand any right guaranteed by the United States Constitution.

Sec. 552.056.  UNLAWFUL DISCRIMINATION.  (a)  In this section:

(1)  “Financial institution” has the meaning assigned by Section 201.101, Finance Code.

(2)  “Insurance entity” means:

(A)  an entity described by Section 82.002(a), Insurance Code;

(B)  a fraternal benefit society regulated under Chapter 885, Insurance Code; or

(C)  the developer of an artificial intelligence system used by an entity described by Paragraph (A) or (B).

(3)  “Protected class” means a group or class of persons with a characteristic, quality, belief, or status protected from discrimination by state or federal civil rights laws, and includes race, color, national origin, sex, age, religion, or disability.

(b)  A person may not develop or deploy an artificial intelligence system with the intent to unlawfully discriminate against a protected class in violation of state or federal law.

(c)  For purposes of this section, a disparate impact is not sufficient by itself to demonstrate an intent to discriminate.

(d)  This section does not apply to an insurance entity for purposes of providing insurance services if the entity is subject to applicable statutes regulating unfair discrimination, unfair methods of competition, or unfair or deceptive acts or practices related to the business of insurance.

(e)  A federally insured financial institution is considered to be in compliance with this section if the institution complies with all federal and state banking laws and regulations.

Sec. 552.057.  CERTAIN SEXUALLY EXPLICIT CONTENT AND CHILD PORNOGRAPHY.  A person may not:

(1)  develop or distribute an artificial intelligence system with the sole intent of producing, assisting or aiding in producing, or distributing:

(A)  visual material in violation of Section 43.26, Penal Code; or

(B)  deep fake videos or images in violation of Section 21.165, Penal Code; or

(2)  intentionally develop or distribute an artificial intelligence system that engages in text-based conversations that simulate or describe sexual conduct, as that term is defined by Section 43.25, Penal Code, while impersonating or imitating a child younger than 18 years of age.

SUBCHAPTER C.  ENFORCEMENT

Sec. 552.101.  ENFORCEMENT AUTHORITY.  (a)  The attorney general has exclusive authority to enforce this chapter, except to the extent provided by Section 552.106.

(b)  This chapter does not provide a basis for, and is not subject to, a private right of action for a violation of this chapter or any other law.

Sec. 552.102.  INFORMATION AND COMPLAINTS.  The attorney general shall create and maintain an online mechanism on the attorney general’s Internet website through which a consumer may submit a complaint under this chapter to the attorney general.

Sec. 552.103.  INVESTIGATIVE AUTHORITY.  (a)  If the attorney general receives a complaint through the online mechanism under Section 552.102 alleging a violation of this chapter, the attorney general may issue a civil investigative demand to determine if a violation has occurred.  The attorney general shall issue demands in accordance with and under the procedures established under Section 15.10.

(b)  The attorney general may request from the person reported through the online mechanism, pursuant to a civil investigative demand issued under Subsection (a):

(1)  a high-level description of the purpose, intended use, deployment context, and associated benefits of the artificial intelligence system with which the person is affiliated;

(2)  a description of the type of data used to program or train the artificial intelligence system;

(3)  a high-level description of the categories of data processed as inputs for the artificial intelligence system;

(4)  a high-level description of the outputs produced by the artificial intelligence system;

(5)  any metrics the person uses to evaluate the performance of the artificial intelligence system;

(6)  any known limitations of the artificial intelligence system;

(7)  a high-level description of the post-deployment monitoring and user safeguards the person uses for the artificial intelligence system, including, if the person is a deployer, the oversight, use, and learning process established by the person to address issues arising from the system’s deployment; or

(8)  any other relevant documentation reasonably necessary for the attorney general to conduct an investigation under this section.

Sec. 552.104.  NOTICE OF VIOLATION; OPPORTUNITY TO CURE.  (a)  If the attorney general determines that a person has violated or is violating this chapter, the attorney general shall notify the person in writing of the determination, identifying the specific provisions of this chapter the attorney general alleges have been or are being violated.

(b)  The attorney general may not bring an action against the person:

(1)  before the 60th day after the date the attorney general provides the notice under Subsection (a); or

(2)  if, before the 60th day after the date the attorney general provides the notice under Subsection (a), the person:

(A)  cures the identified violation; and

(B)  provides the attorney general with a written statement that the person has:

(i)  cured the alleged violation;

(ii)  provided supporting documentation to show the manner in which the person cured the violation; and

(iii)  made any necessary changes to internal policies to reasonably prevent further violation of this chapter.

Sec. 552.105.  CIVIL PENALTY; INJUNCTION.  (a)  A person who violates this chapter and does not cure the violation under Section 552.104 is liable to this state for a civil penalty in an amount of:

(1)  for each violation the court determines to be curable or a breach of a statement submitted to the attorney general under Section 552.104(b)(2), not less than $10,000 and not more than $12,000;

(2)  for each violation the court determines to be uncurable, not less than $80,000 and not more than $200,000; and

(3)  for a continued violation, not less than $2,000 and not more than $40,000 for each day the violation continues.

(b)  The attorney general may bring an action in the name of this state to:

(1)  collect a civil penalty under this section;

(2)  seek injunctive relief against further violation of this chapter; and

(3)  recover attorney’s fees and reasonable court costs or other investigative expenses.

(c)  There is a rebuttable presumption that a person used reasonable care as required under this chapter.

(d)  A defendant in an action under this section may seek an expedited hearing or other process, including a request for declaratory judgment, if the person believes in good faith that the person has not violated this chapter.

(e)  A defendant in an action under this section may not be found liable if:

(1)  another person uses the artificial intelligence system affiliated with the defendant in a manner prohibited by this chapter; or

(2)  the defendant discovers a violation of this chapter through:

(A)  feedback from a developer, deployer, or other person who believes a violation has occurred;

(B)  testing, including adversarial testing or red-team testing;

(C)  following guidelines set by applicable state agencies; or

(D)  if the defendant substantially complies with the most recent version of the “Artificial Intelligence Risk Management Framework: Generative Artificial Intelligence Profile” published by the National Institute of Standards and Technology or another nationally or internationally recognized risk management framework for artificial intelligence systems, an internal review process.

(f)  The attorney general may not bring an action to collect a civil penalty under this section against a person for an artificial intelligence system that has not been deployed.

Sec. 552.106.  ENFORCEMENT ACTIONS BY STATE AGENCIES.  (a)  A state agency may impose sanctions against a person licensed, registered, or certified by that agency for a violation of Subchapter B if:

(1)  the person has been found in violation of this chapter under Section 552.105; and

(2)  the attorney general has recommended additional enforcement by the applicable agency.

(b)  Sanctions under this section may include:

(1)  suspension, probation, or revocation of a license, registration, certificate, or other authorization to engage in an activity; and

(2)  a monetary penalty not to exceed $100,000.

CHAPTER 553.  ARTIFICIAL INTELLIGENCE REGULATORY SANDBOX PROGRAM

SUBCHAPTER A.  GENERAL PROVISIONS

Sec. 553.001.  DEFINITIONS.  In this chapter:

(1)  “Applicable agency” means a department of this state established by law to regulate certain types of business activity in this state and the people engaging in that business, including the issuance of licenses and registrations, that the department determines would regulate a program participant if the person were not operating under this chapter.

(2)  “Department” means the Texas Department of Information Resources.

(3)  “Program” means the regulatory sandbox program established under this chapter that allows a person, without being licensed or registered under the laws of this state, to test an artificial intelligence system for a limited time and on a limited basis.

(4)  “Program participant” means a person whose application to participate in the program is approved and who may test an artificial intelligence system under this chapter.

SUBCHAPTER B.  SANDBOX PROGRAM FRAMEWORK

Sec. 553.051.  ESTABLISHMENT OF SANDBOX PROGRAM.  (a)  The department, in consultation with the council, shall create a regulatory sandbox program that enables a person to obtain legal protection and limited access to the market in this state to test innovative artificial intelligence systems without obtaining a license, registration, or other regulatory authorization.

(b)  The program is designed to:

(1)  promote the safe and innovative use of artificial intelligence systems across various sectors including healthcare, finance, education, and public services;

(2)  encourage responsible deployment of artificial intelligence systems while balancing the need for consumer protection, privacy, and public safety;

(3)  provide clear guidelines for a person who develops an artificial intelligence system to test systems while certain laws and regulations related to the testing are waived or suspended; and

(4)  allow a person to engage in research, training, testing, or other pre-deployment activities to develop an artificial intelligence system.

(c)  The attorney general may not file or pursue charges against a program participant for violation of a law or regulation waived under this chapter that occurs during the testing period.

(d)  A state agency may not file or pursue punitive action against a program participant, including the imposition of a fine or the suspension or revocation of a license, registration, or other authorization, for violation of a law or regulation waived under this chapter that occurs during the testing period.

(e)  Notwithstanding Subsections (c) and (d), the requirements of Subchapter B, Chapter 552, may not be waived, and the attorney general or a state agency may file or pursue charges or action against a program participant who violates that subchapter.

Sec. 553.052.  APPLICATION FOR PROGRAM PARTICIPATION.  (a)  A person must obtain approval from the department and any applicable agency before testing an artificial intelligence system under the program.

(b)  The department by rule shall prescribe the application form.  The form must require the applicant to:

(1)  provide a detailed description of the artificial intelligence system the applicant desires to test in the program, and its intended use;

(2)  include a benefit assessment that addresses potential impacts on consumers, privacy, and public safety;

(3)  describe the applicant’s plan for mitigating any adverse consequences that may occur during the test; and

(4)  provide proof of compliance with any applicable federal artificial intelligence laws and regulations.

Sec. 553.053.  DURATION AND SCOPE OF PARTICIPATION.  (a)  A program participant approved by the department and each applicable agency may test and deploy an artificial intelligence system under the program for a period of not more than 36 months.

(b)  The department may extend a test under this chapter if the department finds good cause for the test to continue.

Sec. 553.054.  EFFICIENT USE OF RESOURCES.  The department shall coordinate the activities under this subchapter and any other law relating to artificial intelligence systems to ensure efficient system implementation and to streamline the use of department resources, including information sharing and personnel.

SUBCHAPTER C.  OVERSIGHT AND COMPLIANCE

Sec. 553.101.  COORDINATION WITH APPLICABLE AGENCY.  (a)  The department shall coordinate with all applicable agencies to oversee the operation of a program participant.

(b)  The council or an applicable agency may recommend to the department that a program participant be removed from the program if the council or applicable agency finds that the program participant’s artificial intelligence system:

(1)  poses an undue risk to public safety or welfare;

(2)  violates any federal law or regulation; or

(3)  violates any state law or regulation not waived under the program.

Sec. 553.102.  PERIODIC REPORT BY PROGRAM PARTICIPANT.  (a)  A program participant shall provide a quarterly report to the department.

(b)  The report shall include:

(1)  metrics for the artificial intelligence system’s performance;

(2)  updates on how the artificial intelligence system mitigates any risks associated with its operation; and

(3)  feedback from consumers and affected stakeholders that are using an artificial intelligence system tested under this chapter.

(c)  The department shall maintain confidentiality regarding the intellectual property, trade secrets, and other sensitive information it obtains through the program.

Sec. 553.103.  ANNUAL REPORT BY DEPARTMENT.  (a)  The department shall submit an annual report to the legislature.

(b)  The report shall include:

(1)  the number of program participants testing an artificial intelligence system in the program;

(2)  the overall performance and impact of artificial intelligence systems tested in the program; and

(3)  recommendations on changes to laws or regulations for future legislative consideration.

CHAPTER 554.  TEXAS ARTIFICIAL INTELLIGENCE COUNCIL

SUBCHAPTER A.  CREATION AND ORGANIZATION OF COUNCIL

Sec. 554.001.  CREATION OF COUNCIL.  (a)  The Texas Artificial Intelligence Council is created to:

(1)  ensure artificial intelligence systems in this state are ethical and developed in the public’s best interest;

(2)  ensure artificial intelligence systems in this state do not harm public safety or undermine individual freedoms by finding issues and making recommendations to the legislature regarding the Penal Code and Chapter 82, Civil Practice and Remedies Code;

(3)  identify existing laws and regulations that impede innovation in the development of artificial intelligence systems and recommend appropriate reforms;

(4)  analyze opportunities to improve the efficiency and effectiveness of state government operations through the use of artificial intelligence systems;

(5)  make recommendations to applicable state agencies regarding the use of artificial intelligence systems to improve the agencies’ efficiency and effectiveness;

(6)  evaluate potential instances of regulatory capture, including undue influence by technology companies or disproportionate burdens on smaller innovators caused by the use of artificial intelligence systems;

(7)  evaluate the influence of technology companies on other companies and determine the existence or use of tools or processes designed to censor competitors or users through the use of artificial intelligence systems;

(8)  offer guidance and recommendations to the legislature on the ethical and legal use of artificial intelligence systems;

(9)  conduct and publish the results of a study on the current regulatory environment for artificial intelligence systems;

(10)  receive reports from the Department of Information Resources regarding the regulatory sandbox program under Chapter 553; and

(11)  make recommendations for improvements to the regulatory sandbox program under Chapter 553.

(b)  The council is administratively attached to the Department of Information Resources, and the department shall provide administrative support to the council as provided by this section.

(c)  The Department of Information Resources and the council shall enter into a memorandum of understanding detailing:

(1)  the administrative support the council requires from the department to fulfill the council’s purposes;

(2)  the reimbursement of administrative expenses to the department; and

(3)  any other provisions necessary to ensure the efficient operation of the council.

Sec. 554.002.  COUNCIL MEMBERSHIP.  (a)  The council is composed of seven members as follows:

(1)  three members of the public appointed by the governor;

(2)  two members of the public appointed by the lieutenant governor; and

(3)  two members of the public appointed by the speaker of the house of representatives.

(b)  Members of the council serve staggered four-year terms, with the terms of three or four members expiring every two years.

(c)  The governor shall appoint a chair from among the members, and the council shall elect a vice chair from its membership.

(d)  The council may establish an advisory board composed of individuals from the public who possess expertise directly related to the council’s functions, including technical, ethical, regulatory, and other relevant areas.

Sec. 554.003.  QUALIFICATIONS.  Members of the council must be Texas residents and have knowledge or expertise in one or more of the following areas:

(1)  artificial intelligence systems;

(2)  data privacy and security;

(3)  ethics in technology or law;

(4)  public policy and regulation;

(5)  risk management related to artificial intelligence systems;

(6)  improving the efficiency and effectiveness of governmental operations; or

(7)  anticompetitive practices and market fairness.

Sec. 554.004.  STAFF AND ADMINISTRATION.  The council may hire an executive director and other personnel as necessary to perform its duties.

SUBCHAPTER B.  POWERS AND DUTIES OF COUNCIL

Sec. 554.101.  ISSUANCE OF REPORTS.  (a)  The council may issue reports to the legislature regarding the use of artificial intelligence systems in this state.

(b)  The council may issue reports on:

(1)  the compliance of artificial intelligence systems in this state with the laws of this state;

(2)  the ethical implications of deploying artificial intelligence systems in this state;

(3)  data privacy and security concerns related to artificial intelligence systems in this state; or

(4)  potential liability or legal risks associated with the use of artificial intelligence systems in this state.

Sec. 554.102.  TRAINING AND EDUCATIONAL OUTREACH.  The council shall conduct training programs for state agencies and local governments on the use of artificial intelligence systems.

Sec. 554.103.  LIMITATION OF AUTHORITY.  The council may not:

(1)  adopt rules or promulgate guidance that is binding for any entity;

(2)  interfere with or override the operation of a state agency; or

(3)  perform a duty or exercise a power not granted by this chapter.

SECTION 5.  Section 325.011, Government Code, is amended to read as follows:

Sec. 325.011.  CRITERIA FOR REVIEW.  The commission and its staff shall consider the following criteria in determining whether a public need exists for the continuation of a state agency or its advisory committees or for the performance of the functions of the agency or its advisory committees:

(1)  the efficiency and effectiveness with which the agency or the advisory committee operates;

(2)(A)  an identification of the mission, goals, and objectives intended for the agency or advisory committee and of the problem or need that the agency or advisory committee was intended to address; and

(B)  the extent to which the mission, goals, and objectives have been achieved and the problem or need has been addressed;

(3)(A)  an identification of any activities of the agency in addition to those granted by statute and of the authority for those activities; and

(B)  the extent to which those activities are needed;

(4)  an assessment of authority of the agency relating to fees, inspections, enforcement, and penalties;

(5)  whether less restrictive or alternative methods of performing any function that the agency performs could adequately protect or provide service to the public;

(6)  the extent to which the jurisdiction of the agency and the programs administered by the agency overlap or duplicate those of other agencies, the extent to which the agency coordinates with those agencies, and the extent to which the programs administered by the agency can be consolidated with the programs of other state agencies;

(7)  the promptness and effectiveness with which the agency addresses complaints concerning entities or other persons affected by the agency, including an assessment of the agency’s administrative hearings process;

(8)  an assessment of the agency’s rulemaking process and the extent to which the agency has encouraged participation by the public in making its rules and decisions and the extent to which the public participation has resulted in rules that benefit the public;

(9)  the extent to which the agency has complied with:

(A)  federal and state laws and applicable rules regarding equality of employment opportunity and the rights and privacy of individuals; and

(B)  state law and applicable rules of any state agency regarding purchasing guidelines and programs for historically underutilized businesses;

(10)  the extent to which the agency issues and enforces rules relating to potential conflicts of interest of its employees;

(11)  the extent to which the agency complies with Chapters 551 and 552 and follows records management practices that enable the agency to respond efficiently to requests for public information;

(12)  the effect of federal intervention or loss of federal funds if the agency is abolished;

(13)  the extent to which the purpose and effectiveness of reporting requirements imposed on the agency justifies the continuation of the requirement; [and]

(14)  an assessment of the agency’s cybersecurity practices using confidential information available from the Department of Information Resources or any other appropriate state agency; and

(15)  an assessment of the agency’s use of artificial intelligence systems, as that term is defined by Section 551.001, Business & Commerce Code, in its operations and its oversight of the use of artificial intelligence systems by persons under the agency’s jurisdiction, and any related impact on the agency’s ability to achieve its mission, goals, and objectives, made using information available from the Department of Information Resources, the attorney general, or any other appropriate state agency.

SECTION 6.  Section 2054.068(b), Government Code, is amended to read as follows:

(b)  The department shall collect from each state agency information on the status and condition of the agency’s information technology infrastructure, including information regarding:

(1)  the agency’s information security program;

(2)  an inventory of the agency’s servers, mainframes, cloud services, and other information technology equipment;

(3)  identification of vendors that operate and manage the agency’s information technology infrastructure; [and]

(4)  any additional related information requested by the department; and

(5)  an evaluation of the use or considered use of artificial intelligence systems, as defined by Section 551.001, Business & Commerce Code, by each state agency.

SECTION 7.  Section 2054.0965(b), Government Code, is amended to read as follows:

(b)  Except as otherwise modified by rules adopted by the department, the review must include:

(1)  an inventory of the agency’s major information systems, as defined by Section 2054.008, and other operational or logistical components related to deployment of information resources as prescribed by the department;

(2)  an inventory of the agency’s major databases, artificial intelligence systems, as defined by Section 551.001, Business & Commerce Code, and applications;

(3)  a description of the agency’s existing and planned telecommunications network configuration;

(4)  an analysis of how information systems, components, databases, applications, and other information resources have been deployed by the agency in support of:

(A)  applicable achievement goals established under Section 2056.006 and the state strategic plan adopted under Section 2056.009;

(B)  the state strategic plan for information resources; and

(C)  the agency’s business objectives, mission, and goals;

(5)  agency information necessary to support the state goals for interoperability and reuse; and

(6)  confirmation by the agency of compliance with state statutes, rules, and standards relating to information resources.

SECTION 8.  Not later than September 1, 2026, the attorney general shall post on the attorney general’s Internet website the information and online mechanism required by Section 552.102, Business & Commerce Code, as added by this Act.

SECTION 9.  (a)  Notwithstanding any other section of this Act, in a state fiscal year, a state agency to which this Act applies is not required to implement a provision found in another section of this Act that is drafted as a mandatory provision imposing a duty on the agency to take an action unless money is specifically appropriated to the agency for that fiscal year to carry out that duty.  The agency may implement the provision in that fiscal year to the extent other funding is available to the agency to do so.

(b)  If, as authorized by Subsection (a) of this section, the state agency does not implement the mandatory provision in a state fiscal year, the state agency, in its legislative budget request for the next state fiscal biennium, shall certify that fact to the Legislative Budget Board and include a written estimate of the costs of implementing the provision in each year of that next state fiscal biennium.

SECTION 10.  This Act takes effect January 1, 2026.

    President of the Senate           Speaker of the House      

I certify that H.B. No. 149 was passed by the House on April 23, 2025, by the following vote:  Yeas 146, Nays 3, 1 present, not voting; and that the House concurred in Senate amendments to H.B. No. 149 on May 30, 2025, by the following vote:  Yeas 121, Nays 17, 2 present, not voting.

______________________________

Chief Clerk of the House   

I certify that H.B. No. 149 was passed by the Senate, with amendments, on May 23, 2025, by the following vote:  Yeas 31, Nays 0.

______________________________

Secretary of the Senate   

APPROVED: __________________

                 Date       

          __________________

               Governor       


Appendix 2 — Model Ordinance: Responsible Use of Artificial Intelligence in City Operations

ORDINANCE NO. ______

AN ORDINANCE

relating to the responsible use of artificial intelligence systems by the City; establishing transparency, accountability, and oversight requirements; and providing for implementation and administration.

WHEREAS, the City recognizes that artificial intelligence (“AI”) systems are increasingly used to improve operational efficiency, service delivery, data analysis, and internal workflows; and

WHEREAS, the City further recognizes that certain uses of AI may influence decisions affecting residents, employees, vendors, or regulated parties and therefore require appropriate oversight; and

WHEREAS, the City seeks to encourage responsible innovation while preserving public trust, transparency, and accountability; and

WHEREAS, the Texas Legislature has enacted the Texas Responsible Artificial Intelligence Governance Act, effective January 1, 2026, establishing statewide standards for AI use by government entities; and

WHEREAS, the City recognizes that the adoption of artificial intelligence tools may, over time, change how work is performed and how staffing needs are structured, and that any such impacts are expected to occur gradually through attrition, reassignment, or role redesign rather than immediate workforce reductions;

NOW, THEREFORE, BE IT ORDAINED BY THE CITY COUNCIL OF THE CITY OF __________, TEXAS:

Section 1. Definitions

For purposes of this Ordinance:

  1. “Artificial Intelligence System” means a computational system that uses machine learning, statistical modeling, or related techniques to perform tasks normally associated with human intelligence, including analysis, prediction, classification, content generation, or prioritization.
  2. “Decision-Adjacent AI” means an AI system that materially influences, prioritizes, or recommends outcomes related to enforcement, eligibility, allocation of resources, personnel actions, procurement decisions, or public services, even if final decisions are made by a human.
  3. “High-Risk AI Use” means deployment of an AI system that directly or indirectly affects individual rights, access to services, enforcement actions, or legally protected interests.
  4. “Department” means any City department, office, division, or agency.

Section 2. Permitted Use of Artificial Intelligence

(a) Internal Productivity Uses. Departments may deploy AI systems for internal productivity and analytical purposes, including but not limited to:

  • Drafting and summarization of documents
  • Data analysis and forecasting
  • Workflow automation
  • Research and internal reporting
  • Customer-service chat tools providing general information (with disclaimers as appropriate)

Such uses shall not require prior Council approval but shall be subject to internal documentation requirements.

(b) Decision-Adjacent Uses. AI systems that influence or support decisions affecting residents, employees, vendors, or regulated entities may be deployed only in accordance with Sections 3 and 4 of this Ordinance.

Section 3. Prohibited Uses

No Department shall deploy or use an AI system that:

  1. Performs social scoring of individuals or groups based on behavior, personal traits, or reputation for the purpose of denying services, benefits, or rights;
  2. Intentionally discriminates against a protected class in violation of state or federal law;
  3. Generates or deploys biometric identification or surveillance in violation of constitutional protections;
  4. Produces or facilitates unlawful deep-fake or deceptive content;
  5. Operates as a fully automated decision-making system without meaningful human review in matters affecting legal rights or obligations.

Section 4. Oversight and Approval for High-Risk AI Uses

(a) Inventory Requirement. The City Manager shall maintain a centralized AI Systems Inventory identifying:

  • Each AI system in use
  • The Department deploying the system
  • The system’s purpose
  • Whether the use is classified as high-risk

(b) Approval Process. Prior to deployment of any High-Risk AI Use, the Department must:

  1. Submit a written justification describing the system’s purpose and scope;
  2. Identify the data sources used by the system;
  3. Describe human oversight mechanisms;
  4. Obtain approval from:
    • The City Manager (or designee), and
    • The City Attorney for legal compliance review.

(c) Human Accountability. Each AI system shall have a designated human owner responsible for:

  • Monitoring performance
  • Responding to errors or complaints
  • Suspending use if risks are identified

Section 5. Transparency and Public Disclosure

(a) Disclosure to the Public. When a City AI system interacts directly with residents, the City shall provide clear notice that the interaction involves AI.

(b) Public Reporting. The City shall publish annually:

  • A summary of AI systems in use
  • The general purposes of high-risk AI systems
  • Contact information for public inquiries

No proprietary or security-sensitive information shall be disclosed.

Section 6. Procurement and Vendor Requirements

All City contracts involving AI systems shall, where applicable:

  1. Require disclosure of AI functions;
  2. Prohibit undisclosed algorithmic decision-making;
  3. Allow the City to audit or review AI system outputs relevant to City operations;
  4. Require vendors to notify the City of material changes to AI functionality.

Section 7. Review and Sunset

(a) Periodic Review. High-risk AI systems shall be reviewed at least annually to assess:

  • Accuracy
  • Bias
  • Continued necessity
  • Compliance with this Ordinance

(b) Sunset Authority. The City Manager may suspend or terminate use of any AI system that poses unacceptable risk or fails compliance review.

Section 8. Training

The City shall provide appropriate training to employees involved in:

  • Deploying AI systems
  • Supervising AI-assisted workflows
  • Interpreting AI-generated outputs

Section 9. Severability

If any provision of this Ordinance is held invalid, such invalidity shall not affect the remaining provisions.

Section 10. Effective Date

This Ordinance shall take effect immediately upon adoption.


Appendix 3 — City Manager Administrative Regulation: Responsible Use of Artificial Intelligence

ADMINISTRATIVE REGULATION NO. ___

Subject: Responsible Use of Artificial Intelligence (AI) in City Operations
Authority: Ordinance No. ___ (Responsible Use of Artificial Intelligence)
Issued by: City Manager
Effective Date: __________

1. Purpose

This Administrative Regulation establishes operational procedures for the responsible deployment, oversight, and monitoring of artificial intelligence (AI) systems used by the City, consistent with adopted Council policy and applicable state law.

The intent is to:

  • Enable rapid adoption of AI for productivity and service delivery;
  • Ensure transparency and accountability for higher-risk uses; and
  • Protect the City, employees, and residents from unintended consequences.

2. Scope

This regulation applies to all City departments, offices, and divisions that:

  • Develop, procure, deploy, or use AI systems; or
  • Rely on vendor-provided software that includes AI functionality.

3. AI System Classification

Departments shall classify AI systems into one of the following categories:

A. Tier 1 — Internal Productivity AI

Examples:

  • Document drafting and summarization
  • Data analysis and forecasting
  • Internal research and reporting
  • Workflow automation

Oversight Level:

  • Department-level approval
  • Registration in AI Inventory

B. Tier 2 — Decision-Adjacent AI

Examples:

  • Permit or inspection prioritization
  • Vendor or application risk scoring
  • Resource allocation recommendations
  • Enforcement or compliance triage

Oversight Level:

  • City Manager approval
  • Legal review
  • Annual performance review

C. Tier 3 — High-Risk AI

Examples:

  • AI influencing enforcement actions
  • Eligibility determinations
  • Public safety analytics
  • Biometric or surveillance tools

Oversight Level:

  • City Manager approval
  • City Attorney review
  • Documented human-in-the-loop controls
  • Annual audit and Council notification

4. AI Systems Inventory

The City Manager’s Office shall maintain a centralized AI Systems Inventory, which includes:

  • System name and vendor
  • Department owner
  • Purpose and classification tier
  • Date of deployment
  • Oversight requirements

Departments shall update the inventory prior to deploying any new AI system.
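For departments that keep this inventory in machine-readable form, a minimal sketch follows. The field names and example values are assumptions for illustration only; the regulation specifies the categories of information to be recorded, not a data format.

```python
# Minimal illustrative record for the AI Systems Inventory described above.
# Field names and the example entry are assumptions for demonstration;
# this regulation prescribes categories of information, not a schema.
from dataclasses import dataclass
from datetime import date

@dataclass
class AISystemRecord:
    system_name: str       # System name
    vendor: str            # Vendor
    department_owner: str  # Department owner
    purpose: str           # Purpose of the system
    tier: int              # Classification tier (1, 2, or 3)
    deployed_on: date      # Date of deployment
    oversight: list[str]   # Oversight requirements for this tier

inventory = [
    AISystemRecord(
        system_name="Document summarizer",
        vendor="ExampleVendor (hypothetical)",
        department_owner="City Secretary",
        purpose="Summarize agenda packets",
        tier=1,
        deployed_on=date(2026, 1, 15),
        oversight=["Department-level approval", "Inventory registration"],
    ),
]
```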

5. Approval Process

A. Tier 1 Systems

  • Approved by Department Director
  • Registered in inventory

B. Tier 2 and Tier 3 Systems

Departments must submit:

  1. A description of the system and intended use
  2. Data sources and inputs
  3. Description of human oversight
  4. Risk mitigation measures

Approval required from:

  • City Manager (or designee)
  • City Attorney (for legal compliance)

6. Human Oversight & Accountability

Each AI system shall have a designated System Owner responsible for:

  • Monitoring system outputs
  • Responding to errors or complaints
  • Suspending use if risks emerge
  • Coordinating audits or reviews

No AI system may operate as a fully autonomous decision-maker for actions affecting legal rights or obligations.

7. Vendor & Procurement Controls

Procurement involving AI systems shall:

  • Identify AI functionality explicitly in solicitations
  • Require vendors to disclose material AI updates
  • Prohibit undisclosed algorithmic decision-making
  • Preserve City audit and review rights

8. Monitoring, Review & Sunset

  • Tier 2 and Tier 3 systems shall undergo annual review.
  • Systems may be suspended or sunset if:
    • Accuracy degrades
    • Bias is identified
    • Legal risk increases
    • The system no longer serves a defined purpose

9. Training

Departments deploying AI shall ensure appropriate staff training covering:

  • Proper interpretation of AI outputs
  • Limitations of AI systems
  • Escalation and error-handling procedures

10. Reporting to Council

The City Manager shall provide Council with:

  • An annual summary of AI systems in use
  • Identification of Tier 3 (High-Risk) systems
  • Any material incidents or corrective actions

11. Effective Date

This Administrative Regulation is effective immediately upon issuance.

12. Workforce Considerations

The use of artificial intelligence systems may change job functions and workflows over time. Departments shall:

  • Use AI to augment employee capabilities wherever possible;
  • Prioritize retraining, reassignment, and natural attrition when workflows change;
  • Coordinate with Human Resources before deploying AI systems that materially alter job duties; and
  • Recognize that long-term staffing impacts, if any, remain subject to City Manager and City Council authority.

Appendix 4 — Public-Facing FAQ: Responsible Use of Artificial Intelligence in City Operations

What is this ordinance about?

This ordinance establishes clear rules for how the City may use artificial intelligence (AI) tools. It allows the City to use modern technology to improve efficiency and service delivery while ensuring that higher-risk uses are transparent, accountable, and overseen by people.

Is the City already using artificial intelligence?

Yes. Like most modern organizations, the City already uses limited AI-enabled tools for tasks such as document drafting, data analysis, customer service support, and vendor-provided software systems.

This ordinance ensures those tools are used consistently and responsibly.

Is this ordinance banning artificial intelligence?

No.
The ordinance does not ban AI. It encourages responsible adoption of AI for productivity and internal efficiency while placing guardrails on uses that could affect people’s rights or access to services.

Why is the City adopting rules now?

AI tools are becoming more common and more capable. Clear rules help ensure:

  • Transparency in how AI is used
  • Accountability for outcomes
  • Compliance with new Texas law
  • Public trust in City operations

The Texas Legislature recently enacted statewide standards for AI use by government entities, and this ordinance aligns the City with those expectations.

Will artificial intelligence affect City jobs?

AI may change how work is done over time, just as previous technologies have.

This ordinance does not authorize immediate workforce reductions. Any long-term impacts are expected to occur gradually and, where possible, through:

  • Natural attrition
  • Reassignment
  • Retraining
  • Changes in job duties

Final staffing decisions remain with City leadership and City Council.

Will AI replace City employees?

AI tools are intended to assist employees, not replace human judgment. For higher-risk uses, the ordinance requires meaningful human oversight and accountability.

Can AI make decisions about me automatically?

No.
The ordinance prohibits fully automated decision-making that affects legal rights, enforcement actions, or access to services without human review.

AI may provide information or recommendations, but people remain responsible for decisions.

Will the City use AI for surveillance or facial recognition?

The ordinance prohibits AI uses that violate constitutional protections, including improper biometric surveillance.

Any use of biometric or surveillance-related AI would require strict legal review and compliance with state and federal law.

How will I know if I’m interacting with AI?

If the City uses AI systems that interact directly with residents, the City must clearly disclose that you are interacting with an AI system.

Does this apply to police or public safety?

Yes.
AI tools used in public safety contexts are considered higher-risk and require additional review, approval, and oversight. AI systems may not independently make enforcement decisions.

Who is responsible if an AI system makes a mistake?

Each AI system has a designated City employee responsible for monitoring its use, addressing errors, and suspending the system if necessary.

Responsibility remains with the City—not the software.

Will the public be able to see how AI is used?

Yes.
The City will publish an annual summary describing:

  • The types of AI systems in use
  • Their general purpose
  • How residents can ask questions or raise concerns

Sensitive or proprietary information will not be disclosed.

Does this create a new board or bureaucracy?

No.
Oversight is handled through existing City leadership and administrative structures.

Is there a cost to adopting this ordinance?

There is no direct cost associated with adoption. Over time, responsible AI use may help control costs by improving productivity and efficiency.

How often will this policy be reviewed?

Higher-risk AI systems are reviewed annually. The ordinance itself may be updated as technology and law evolve.

Who can I contact with questions or concerns?

Residents may contact the City Manager’s Office or submit inquiries through the City’s website. Information on AI use and reporting channels will be publicly available.

Bottom Line

This ordinance ensures the City:

  • Uses modern tools responsibly
  • Maintains human accountability
  • Protects public trust
  • Aligns with Texas law
  • Adapts thoughtfully to technological change

An Update on Drone Uses in Texas Municipalities

A second collaboration between Lewis McLain & AI

From Tactical Tools to a Quiet Redefinition of First Response

A decade ago, a municipal drone program in Texas usually meant a small team, a locked cabinet, and a handful of specially trained officers who were called out when circumstances justified it. The drone was an accessory—useful, sometimes impressive, but peripheral to the ordinary rhythm of public safety.

That is no longer the case.

Across Texas, drones are being absorbed into the daily mechanics of emergency response. In a growing number of cities, they are no longer something an officer brings to a scene. They are something the city sends—often before the first patrol car, engine, or ambulance has cleared an intersection.

This shift is subtle, technical, and easily misunderstood. But it represents one of the most consequential changes in municipal public safety design in a generation.


The quiet shift from tools to systems

The defining change is not better cameras or longer flight times. It is program design.

Early drone programs were built around people: pilots, certifications, and equipment checklists. Today’s programs are built around systems—launch infrastructure, dispatch logic, real-time command centers, and policies that define when a drone may be used and, just as importantly, when it may not.

Cities like Arlington illustrate this evolution clearly. Arlington’s drones are not stored in trunks or deployed opportunistically. They launch from fixed docking stations, are controlled through the city’s real-time operations center, and are sent to calls the way any other responder would be. The drone’s role is not to replace officers, but to give them something they rarely had before arrival: certainty.

Is someone actually inside the building? Is the suspect still there? Is the person lying in the roadway injured or already moving? These are small questions, but they shape everything that follows. In many cases, the presence of a drone overhead resolves a situation before physical contact ever occurs.

That pattern—early information reducing risk—is now being repeated, in different forms, across the state.


North Texas as an early laboratory

In North Texas, the progression from experimentation to normalization is especially visible.

Arlington’s program has become a reference point, not because it is flashy, but because it works. Drones are treated as routine assets, subject to policy, supervision, and after-action review. Their value is measured in response times and avoided escalations, not in flight hours.

Nearby, Dallas is navigating a more complex path. Dallas already operates one of the most active municipal drone programs in the state, but scale changes everything. Dense neighborhoods, layered airspace, multiple airports, and heightened civil-liberties scrutiny mean that Dallas cannot simply replicate what smaller cities have done.

Instead, Dallas appears to be doing something more consequential: deliberately embedding “Drone as First Responder” capability into its broader public-safety technology framework. Procurement language and public statements now describe drones verifying caller information while officers respond—a quiet but important acknowledgement that drones are becoming part of the dispatch process itself. If Dallas succeeds, it will establish a model for large, complex cities that have so far watched DFR from a distance.

Smaller cities have moved faster.

Prosper, for example, has embraced automation as a way to overcome limited staffing and long travel distances. Its program emphasizes speed—sub-two-minute arrivals made possible by automated docking stations that handle charging and readiness without human intervention. Prosper’s experience suggests that cities do not have to grow into DFR gradually; some can leap directly to system-level deployment.

Cities like Euless represent another important strand of adoption. Their programs are smaller, more cautious, and intentionally bounded. They launch drones to specific call types, collect experience, and adjust policy as they go. These cities matter because they demonstrate how DFR spreads laterally, city by city, through observation and imitation rather than mandates or statewide directives.


South Texas and the widening geography of DFR

DFR is not a North Texas phenomenon.

In the Rio Grande Valley, Edinburg has publicly embraced dispatch-driven drone response for crashes, crimes in progress, and search-and-rescue missions, including night operations using thermal imaging. In regions where heat, terrain, and distance complicate traditional response, the value of rapid aerial awareness is obvious.

Further west, Laredo has framed drones as part of a broader rapid-response network rather than a narrow policing tool. Discussions there extend beyond observation to include overdose response and medical support, pointing toward a future where drones do more than watch—they enable intervention while ground units close the gap.

Meanwhile, cities like Pearland have quietly done the hardest work of all: making DFR ordinary. Pearland’s early focus on remote operations and program governance is frequently cited by other cities, even when it draws little public attention. Its lesson is simple but powerful: the more boring a drone program becomes, the more likely it is to scale.


What 2026 will likely bring

By 2026, Texas municipalities will no longer debate drones in abstract terms. The conversation will shift to coverage, performance, and restraint.

City leaders will ask how much of their jurisdiction can be reached within two or three minutes, and what it costs to achieve that standard. DFR coverage maps will begin to resemble fire-station service areas, and response-time percentiles will replace anecdotal success stories.

Dispatch ownership will matter more than pilot skill. The most successful programs will be those in which drones are managed as part of the call-taking and response ecosystem, not as specialty assets waiting for permission. Pilots will become supervisors of systems, not just operators of aircraft.

At the same time, privacy will increasingly determine the pace of expansion. Cities that define limits early—what drones will never be used for, how long video is kept, who can access it—will move faster and with less friction. Those that delay these conversations will find themselves stalled, not by technology, but by public distrust.

Federal airspace rules will continue to separate tactical programs from scalable ones. Dense metro areas will demand more sophisticated solutions—automated docks, detect-and-avoid capabilities, and carefully designed flight corridors. The cities that solve these problems will not just have better drones; they will have better systems.

And perhaps most telling of all, drones will gradually fade from public conversation. When residents stop noticing them—when a drone overhead is no more remarkable than a patrol car passing by—the transformation will be complete.


A closing thought

Texas cities are not adopting drones because they are fashionable or futuristic. They are doing so because time matters, uncertainty creates risk, and early information saves lives—sometimes by prompting action, and sometimes by preventing it.

By 2026, the question will not be whether drones belong in municipal public safety. It will be why any city, given the chance to act earlier and safer, would choose not to.


Looking Ahead to 2026: When Drones Become Ordinary

By 2026, the most telling sign of success for municipal drone programs in Texas will not be innovation, expansion, or even capability. It will be normalcy.

The early years of public-safety drones were marked by novelty. A drone launch drew attention, generated headlines, and often triggered anxiety about surveillance or overreach. That phase is already fading. What is emerging in its place is quieter and far more consequential: drones becoming an assumed part of the response environment, much like radios, body cameras, or computer-aided dispatch systems once did.

The conversation will no longer revolve around whether a city has drones. Instead, it will focus on coverage and performance. City leaders will ask how quickly aerial eyes can reach different parts of the city, how often drones arrive before ground units, and what percentage of priority calls benefit from early visual confirmation. Response-time charts and service-area maps will replace anecdotes and demonstrations. In this sense, drones will stop being treated as technology and start being treated as infrastructure.

This shift will also clarify responsibility. The most mature programs will no longer center on individual pilots or specialty units. Ownership will move decisively toward dispatch and real-time operations centers. Drones will be launched because a call meets predefined criteria, not because someone happens to be available or enthusiastic. Pilots will increasingly function as system supervisors, ensuring compliance, safety, and continuity, rather than as hands-on operators for every flight.

At the same time, restraint will become just as important as reach. Cities that succeed will be those that articulate, early and clearly, what drones are not for. By 2026, residents will expect drone programs to come with explicit boundaries: no routine patrols, no generalized surveillance, no silent expansion of mission. Programs that fail to define those limits will find themselves stalled, regardless of how capable the technology may be.

Federal airspace rules and urban complexity will further separate casual programs from durable ones. Large cities will discover that scaling drones is less about buying more aircraft and more about solving coordination problems—airspace, redundancy, automation, and integration with other systems. The cities that work through those constraints will not just fly more often; they will fly predictably and defensibly.

And then, gradually, the attention will drift away.

When a drone arriving overhead is no longer remarkable—when it is simply understood as one of the first tools a city sends to make sense of an uncertain situation—the transition will be complete. The public will not notice drones because they will no longer symbolize change. They will symbolize continuity.

That is the destination Texas municipalities are approaching: not a future where drones dominate public safety, but one where they quietly support it—reducing uncertainty, improving judgment, and often preventing escalation precisely because they arrive early and ask the simplest question first: What is really happening here?

By 2026, the most advanced drone programs in Texas will not feel futuristic at all. They will feel inevitable.

How Could the Minnesota Fraud Happen — and Why Texas Didn’t See the Same Outcome

A collaboration between Lewis McLain & AI

The recent revelation that federal prosecutors believe up to half of roughly $18 billion in federal funds administered through Minnesota programs may have been fraudulently claimed has raised a deeper and more troubling question than simple criminal wrongdoing. The central issue is not whether fraud occurred — it clearly did — but how such a vast scheme could persist for years without decisive intervention, and why similar failures did not reach the same scale in other states, particularly Texas.

Answering that question requires stepping away from partisan framing and examining program design, administrative architecture, timing of awareness, and institutional decision-making.


I. The Nature of the Programs Involved

Most of the funds at issue flowed through federally funded, state-administered social service programs, including:

  • Child nutrition programs
  • Medicaid-related services (including autism therapy and home-based supports)
  • Housing and disability assistance

These programs share several structural features:

  1. Claim-based reimbursement
    Providers self-report services and are reimbursed automatically.
  2. Pay-first, audit-later design
    Verification occurs months or years after funds are disbursed.
  3. Private delivery model
    States administer eligibility and payment, but do not deliver services directly.

This structure prioritizes speed, access, and continuity of care, particularly for vulnerable populations. It also creates an inherent vulnerability: fraud can scale faster than oversight.


II. What Was the Same Across States

Minnesota’s experience was not unique in its basic mechanics. Similar fraud dynamics appeared in California, New York, Illinois, and federal pandemic programs.

Across all jurisdictions:

  • Emergency COVID waivers loosened documentation and oversight
  • Provider enrollment was expedited
  • Site visits and in-person verification were suspended
  • Payment systems remained automated

Fraud exploited time gaps, not policy intent. These systems were designed to avoid denying care — not to stop sophisticated abuse in real time.


III. Where Minnesota Was Different

Minnesota’s case diverged from other states in three critical ways.

1. Scale and concentration

Other states experienced:

  • Thousands of small or mid-sized fraud cases
  • Losses spread across geography and programs

Minnesota experienced:

  • Highly organized networks
  • Multi-program overlap
  • Extraordinary dollar concentration per scheme

Federal prosecutors described the activity as “industrial-scale fraud,” not opportunistic abuse.


2. Early warnings before peak losses

Unlike many states where fraud was discovered after funds were gone, Minnesota agencies:

  • Flagged suspicious activity as early as 2019–2020
  • Documented implausible service volumes
  • Raised concerns internally and to federal partners

In the Feeding Our Future case — the catalyst for the broader investigation — state officials attempted to halt funding, triggering litigation that slowed enforcement. Payments continued while warning signs mounted.

This is a critical distinction: Minnesota saw the smoke before the fire peaked.


3. Fragmented authority

Minnesota’s human-services system is highly decentralized:

  • Provider approval, payment, audit, and enforcement are split across agencies
  • Counties and nonprofits operate with significant autonomy
  • Courts can limit administrative action during disputes

No single entity had both the authority and speed to stop payments decisively once fraud was suspected.


IV. When the Administration Became Aware — and How

The timeline matters.

  • 2019–early 2020: Program staff note irregular claims
  • Summer 2020: State agencies formally report concerns to federal partners
  • Late 2020: State attempts to terminate funding; litigation intervenes
  • February 2021: Referral to the FBI; federal criminal investigation begins
  • January 2022: FBI raids and indictments become public
  • 2022–2025: Investigation expands across multiple programs, revealing the larger scope

Senior state leadership was aware of suspected fraud well before public disclosure, but precise documentation of when the governor’s office was formally briefed remains unclear in the public record.

What is clear is that awareness preceded full intervention, and intervention lagged the growth of the schemes.


V. Why This Did Not Dominate the 2024 Election

Despite early knowledge within agencies, the issue did not meaningfully shape the 2024 election for several reasons:

  1. The full scale was not publicly known
    The $18 billion figure emerged only in late 2025.
  2. Early cases appeared isolated
    Feeding Our Future (~$300 million) looked large but contained.
  3. Complexity discouraged amplification
    The story lacked a simple narrative during a crowded election cycle.
  4. Investigations were ongoing
    Media and campaigns avoid claims not yet fully adjudicated.

By the time the magnitude became undeniable, the election had passed.


VI. Comparison to Texas: Same Programs, Different Outcomes

Texas administers the same federal programs — yet did not experience Minnesota-scale losses. The difference lies in governance design, not moral superiority.

1. Centralized authority

Texas operates through a strongly centralized Health and Human Services Commission. Provider enrollment, payment, and termination authority are consolidated.

Result: Payments can be halted quickly.


2. Provider enrollment rigor

Texas imposes:

  • Lengthy onboarding
  • Fingerprinting and ownership scrutiny
  • Financial viability checks

This slows access — and blocks shell entities.


3. Willingness to disrupt services

Texas is institutionally willing to:

  • Suspend providers first
  • Litigate later
  • Accept short-term service disruption

Minnesota showed greater hesitation, prioritizing continuity and legal caution.


4. Enforcement posture

Texas uses:

  • An aggressive Medicaid Fraud Control Unit
  • Early Attorney General involvement
  • Parallel civil and criminal actions

Fraud is treated as law enforcement first, not program management.


5. Blunt controls over elegant analytics

Texas relies on:

  • Hard caps
  • Billing thresholds
  • Manual overrides

The system is crude — but constraining. Minnesota relied more on trust and review.


VII. The Tradeoff at the Core

The contrast reveals a fundamental governance choice:

  • Minnesota prioritized access, trust, and decentralization
  • Texas prioritized control, authority, and risk tolerance

Neither model is clean. Both have costs. Only one prevented runaway scale.


VIII. What This Case Ultimately Reveals

This was not a failure of compassion, nor evidence of coordinated state wrongdoing. It was a failure of system architecture.

Modern aid systems that optimize for:

  • Speed
  • Equity
  • Access

must also invest in:

  • Real-time anomaly detection
  • Unified authority
  • Rapid payment suspension powers

Without those, fraud will always scale faster than oversight.
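As a concrete illustration of what “real-time anomaly detection” can mean in practice, consider a minimal plausibility screen of the kind that could flag implausible service volumes before payment. Every provider name, figure, and threshold below is hypothetical; it is a sketch of the technique, not a description of any state’s actual system.

```python
# Hypothetical plausibility screen: flag providers whose claimed daily
# volume exceeds an assumed physical capacity ceiling. All names,
# figures, and thresholds are invented for illustration.
MAX_TURNOVER = 10  # assumed servings per seat per day

claims = [
    {"provider": "Site A", "meals_per_day": 5_000, "seats": 40},
    {"provider": "Site B", "meals_per_day":   300, "seats": 60},
]

for claim in claims:
    ceiling = claim["seats"] * MAX_TURNOVER
    if claim["meals_per_day"] > ceiling:
        print(f"FLAG {claim['provider']}: claimed {claim['meals_per_day']:,} "
              f"meals/day vs. plausible ceiling {ceiling:,}")
```

Checks this simple cannot prove fraud, but they can trigger the one thing the Minnesota system lacked: a fast, authoritative pause before funds leave the building.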


Conclusion

Minnesota did not invent fraud, and Texas did not eliminate it. The difference lies in how quickly each system can say “stop” when something goes wrong.

Minnesota saw the warning signs — but lacked the integrated authority to act decisively. Texas acts decisively — sometimes harshly — and accepts the consequences.

That is the real lesson of the Minnesota case: not who failed morally, but which systems are structurally capable of stopping abuse once it begins.

Texas Local Government: Sovereignty, Delegation, Fragmentation, and the State’s Return to Planning

A collaboration between Lewis McLain & AI

Only Two Sovereigns

Any serious discussion of Texas local government must begin with a foundational constitutional fact:

In the United States, there are only two levels of sovereign government:
the federal government and the states.

That is the full list.

Counties, cities, school districts, special districts, authorities, councils, boards, and commissions are not sovereign. They possess no inherent authority. They exist only because a state legislature has chosen to delegate specific powers to them, and those powers may be expanded, limited, preempted, reorganized, or withdrawn entirely.

Texas local government is therefore not a story of decentralization.
It is a story of delegated administration, followed—inevitably—by state-directed coordination when delegation produced excessive fragmentation.


The State of Texas as Sovereign and System Designer

The State of Texas is sovereign within its constitutional sphere. That sovereignty includes the authority to:

  • Create local governments
  • Define and limit their powers
  • Redraw or freeze their boundaries
  • Preempt their ordinances
  • Reorganize or abolish them

Local governments are not junior partners in sovereignty. They are instruments through which the state governs a vast and diverse territory.

From the beginning, Texas made a defining structural choice:
rather than consolidate government as complexity increased, it would delegate narrowly, preserve local identity, and retain sovereignty at the state level. That choice explains the layered system that followed.


Counties: The First Subdivision of State Power

Counties were Texas’s original subdivision of state authority, adopted after independence and statehood from Anglo-American legal traditions.

They were designed for a frontier world:

  • Sparse population
  • Horseback travel
  • Local courts
  • Recordkeeping
  • Elections
  • Law enforcement

During the 19th century, Texas rapidly carved itself into counties so residents could reach a county seat in roughly a day’s travel. By the early 20th century, the county map had largely frozen at 254 counties, a number that remains unchanged today.

Counties are constitutional entities, but they are governed strictly by Dillon’s Rule. They have no inherent powers, no residual authority, and little flexibility to adapt structurally. Once the county map was locked in place, counties became increasingly mismatched to Texas’s urbanizing reality—too small in some areas, too weak in others, and too rigid everywhere.

Rather than consolidate counties, Texas chose to work around them.


Dillon’s Rule: The Legal Engine of Delegation

The doctrine that made this system possible is Dillon’s Rule, named after John Forrest Dillon (1831–1914), Chief Justice of the Iowa Supreme Court and later a professor at Columbia Law School. His 1872 treatise, Commentaries on the Law of Municipal Corporations, emerged during a period of explosive city growth and widespread municipal corruption.

Dillon rejected the notion that local governments possessed inherent authority. He articulated a rule designed to preserve state supremacy:

A local government may exercise only
(1) powers expressly granted by the legislature,
(2) powers necessarily implied from those grants, and
(3) powers essential to its declared purpose—not merely convenient, but indispensable.
Any reasonable doubt is resolved against the local government.

Texas did not merely adopt Dillon’s Rule; it embedded it structurally. Counties, special districts, ISDs, and authorities operate squarely under Dillon’s Rule. Even cities escape it only partially through home-rule charters, and only to the extent the Legislature allows.

Dillon’s Rule explains why Texas governance favors many narrow entities over few powerful ones.


Cities: Delegated Urban Management, Not Local Sovereignty

As towns grew denser, counties proved incapable of providing urban services. The state responded by authorizing cities to manage:

  • Police and fire protection
  • Streets and utilities
  • Zoning and land use
  • Local infrastructure

Cities are therefore delegated urban managers, not sovereign governments.

Texas later adopted home-rule charters to give larger cities greater flexibility, but home rule is widely misunderstood. It does not reverse Dillon’s Rule. It merely allows cities to act unless prohibited—while preserving the Legislature’s power to preempt, override, or limit local authority at any time.

Recent state preemption is not a breakdown of the system. It is the system operating as designed.


Independent School Districts: Function Over Geography

Education exposed the limits of place-based governance earlier than any other function.

Counties were too uneven.
Cities were too political.
Education required stability, long planning horizons, and uniform oversight.

Texas responded by removing education from both counties and cities and creating Independent School Districts.

ISDs are:

  • Single-purpose governments
  • Granted independent taxing authority
  • Authorized to issue bonds
  • Subject to state curriculum and accountability mandates

ISDs do not answer to cities or counties. They answer directly to the state. This was one of Texas’s earliest and clearest moves toward functional specialization over territorial governance.


Special Districts: Precision Instead of Consolidation

As Texas industrialized and urbanized in the 20th century, the Legislature faced increasingly specific problems:

  • Flood control
  • Water supply
  • Drainage
  • Fire protection
  • Hospitals
  • Ports and navigation

Rather than expand general-purpose governments, Texas created special districts—single-mission entities with narrow authority and dedicated funding streams.

Special districts are not accidental inefficiencies. They reflect a deliberate state preference:

Solve problems with precision, not with consolidation.

The result was effectiveness and speed, at the cost of growing fragmentation.


MUDs and Authorities: Growth and Risk as State Policy

Municipal Utility Districts and authorities are often mistaken for private or quasi-private entities. Legally, they are governments.

MUDs:

  • Are created under state law
  • Levy taxes
  • Issue bonds
  • Are governed by elected boards
  • Provide essential infrastructure

They allow the state to:

  • Enable development before cities arrive
  • Finance infrastructure without municipal debt
  • Shift costs to future residents
  • Avoid restructuring counties

Similarly, transit authorities, toll authorities, housing authorities, and local government corporations exist to isolate risk, bypass constitutional debt limits, and accelerate projects. These are not loopholes. They are state-designed instruments.


The Consequence: Functional Fragmentation

By the mid-20th century, Texas governance had become highly functional—and deeply fragmented:

  • Fixed counties
  • Expanding cities
  • Independent ISDs
  • Thousands of special districts
  • Authorities operating alongside cities
  • Infrastructure crossing every boundary

The system worked locally, but failed regionally.

No entity could plan coherently across jurisdictions. Funding decisions conflicted. Infrastructure systems overlapped. Federal requirements could not be met cleanly. At this point, Texas made another defining choice.

It did not consolidate governments.
It pulled planning and coordination back upward, closer to the state.


Councils of Governments: State-Authorized Coordination

Beginning in the 1960s, Texas authorized Councils of Governments (COGs) to address fragmentation.

Today:

  • 24 COGs cover the entire state
  • Each spans multiple counties
  • Membership includes cities, counties, ISDs, and districts

COGs:

  • Have no taxing authority
  • Have no regulatory power
  • Have no police power

They exist to coordinate, not to govern—to reconnect what delegation had scattered. Their weakness is intentional. They sit conceptually just beneath the state, not beneath local governments.


MPOs: Transportation Planning Pulled Upward

Transportation forced an even clearer pull-back.

Texas has 25 Metropolitan Planning Organizations, designated by the state to comply with federal law. MPOs plan, prioritize, and allocate federal transportation funding. They do not build roads, levy taxes, or override governments.

MPOs act as planning membranes between federal mandates and Texas’s fragmented local structure.


Water: Where Texas Explicitly Rejected Fragmentation

Water planning most clearly demonstrates the limits of local delegation.

Texas spans 15 major river basins, with annual rainfall ranging from under 10 inches in the west to over 50 inches in the east. Water ignores counties, cities, ISDs, and districts entirely.

Texas responded by creating:

  • Approximately 23 river authorities, organized by watershed
  • 16 Regional Water Planning Areas, overseen by the Texas Water Development Board
  • A unified State Water Plan, adopted by the Legislature

Regional Water Planning Groups govern planning, not operations. Funding eligibility flows from compliance. This is state-directed regional planning with local execution.

Texas also created 95+ Groundwater Conservation Districts, organized by aquifer rather than politics—another instance of function overriding geography.


Public Health and Other Quiet Pull-Backs

Public health produced the same result. Disease ignores jurisdictional lines. Texas authorized county, city-county, and multi-county health districts to exercise delegated state police powers regionally.

The same pattern appears elsewhere:

  • Emergency management regions
  • Workforce development boards
  • Judicial administrative regions
  • 20 Education Service Centers
  • Air-quality nonattainment regions

Each represents the same logic:

  1. Delegation fragments
  2. Fragmentation impairs system performance
  3. The state restores coordination without transferring sovereignty

Final Synthesis

Texas local government did not evolve haphazardly. It followed a consistent philosophy:

  • Preserve sovereignty at the state level
  • Delegate functions narrowly
  • Avoid consolidation
  • Specialize relentlessly
  • Pull planning back upward when fragmentation becomes unmanageable

What appears complex or chaotic is actually layered intent.

Services are delegated downward.
Planning is pulled back upward.
Sovereignty never moves.

That tension—between delegation and coordination—is not a flaw in Texas government.
It is its defining structural feature.


Sydney, Australia: An Updated Case Study on Two Previous Essays Regarding a Serious Topic

A collaboration between Lewis McLain & AI

Public tragedies have a way of collapsing time. Old debates are reopened as if they were never had. Long-standing policies are treated as provisional. And political reflexes reassert themselves with a familiar urgency: something must be done, and whatever is done must be fast, visible, and legislative.

A recent Reuters report describing a mass shooting at a beachside gathering in Australia illustrates this pattern with uncomfortable clarity. The event itself was horrifying. The response was predictable. Within hours, political leaders were discussing emergency parliamentary sessions, tightening gun licensing laws, and revisiting a firearm regime that has been in place for nearly three decades.

What makes this episode especially instructive is not that it occurred in Australia, but that it occurred despite Australia’s reputation for having among the strictest gun control laws in the world. The country’s post-1996 framework—created in the wake of the Port Arthur massacre—has long been cited internationally as a model of decisive legislative action. Yet here, after decades of regulation, registration, licensing, and oversight, the instinctive answer remains the same: more law.

This essay treats the Australian response not as an anomaly, but as a continuation—and confirmation—of two arguments I have made previously: one concerning mass shootings as a systems failure rather than a purely legal failure, and another concerning what I have called “one-page laws”—the belief that complex social problems can be solved by concise statutes and urgent press conferences.


The Reuters Story, Paraphrased

According to Reuters, a deadly shooting at a public gathering in Bondi shocked Australians and immediately raised questions about whether the country’s long-standing firearms regime remains adequate. One of the suspects reportedly held a legal gun license and was authorized to own multiple firearms. In response, state and federal officials suggested that parliament might be recalled to consider reforms, including changes to license duration, suitability assessments, and firearm ownership limits.

The article notes that while Australia’s gun laws dramatically reduced firearm deaths after 1996, the number of legally owned guns has since risen to levels exceeding those prior to the reforms. Advocates argue that this growth, combined with modern risks, requires updated legislation. Political leaders signaled openness to acting quickly.

What the article does not do—and what most post-tragedy coverage does not do—is explain precisely how additional laws would have prevented this specific act, or how such laws would be meaningfully enforced without expanding surveillance, discretion, or intrusion into everyday life.

That omission is not accidental. It reflects a deeper habit in public governance.


The First Essay Revisited: Mass Shootings as Systems Failures

In my earlier essay on mass shootings, I argued that these events are rarely the result of a single legal gap. Instead, they emerge from systemic breakdowns: failures of detection, communication, intervention, and follow-through. Warning signs often exist. Signals are missed, dismissed, or siloed. Institutions act sequentially rather than collectively.

The presence or absence of one additional statute does little to alter those dynamics.

The Australian case reinforces this point. The suspect was not operating in a legal vacuum. The system already required licensing, registration, and approval. The breakdown did not occur because the law was silent; it occurred because law is only one input into a much larger human system.

When tragedy strikes, however, it is far easier to amend a statute than to admit that prevention depends on imperfect human judgment, social cohesion, mental health systems, community reporting, and inter-agency coordination. Laws are tangible. Systems are messy.


The Second Essay Revisited: The Illusion of One-Page Laws

My essay on one-page laws addressed a related but broader problem: the temptation to treat legislation as a substitute for governance.

One-page laws share several characteristics:

  • They are easy to describe.
  • They signal moral seriousness.
  • They create the appearance of action.
  • They externalize complexity.

The harder questions—Who enforces this? How often? With what discretion? At what cost? With what error rate?—are deferred or ignored.

The Australian response fits this pattern precisely. Proposals to shorten license durations or tighten suitability standards sound decisive, but they conceal the real burden: reviewing thousands of existing licenses, detecting future risk in people who have not yet exhibited it, and doing so without violating basic principles of fairness or due process.

The law can authorize action. It cannot supply foresight.


Where the Two Essays Converge

Taken together, these two arguments point to a shared conclusion: legislation is often mistaken for resolution.

Mass violence is not primarily a legislative failure; it is a detection and intervention failure. One-page laws feel comforting because they compress complexity into moral clarity. But compression is not the same as control.

Australia’s experience underscores a difficult truth: once a society has implemented baseline restrictions, further legislative tightening produces diminishing returns. The remaining risk lies not in legal gaps, but in human unpredictability. Eliminating that last fraction of risk would require levels of monitoring and preemption that most free societies rightly reject.

This is the trade-off no emergency session of parliament wants to articulate.


Why the Reflex Persists

The rush to legislate after tragedy is not irrational—it is political. Laws are visible acts of leadership. They reassure the public that order is being restored. Admitting that not every horror can be prevented without dismantling civil society is a harder message to deliver.

But honesty matters.

Governance is not the art of passing laws; it is the discipline of building systems that function under stress. When tragedy is followed immediately by legislative theater, it risks substituting symbolism for substance and urgency for effectiveness.


Conclusion

The Bondi shooting is not evidence that Australia’s gun laws have failed in some absolute sense. Nor is it proof that further legislation will succeed. It is, instead, a case study—one that reinforces two prior conclusions:

First, that mass violence persists even in highly regulated environments because it arises from human systems, not statutory voids.

Second, that one-page laws offer emotional relief but rarely operational solutions.

Serious problems deserve serious thinking. Not every response can be reduced to a bill number and a headline. And not every tragedy has a legislative cure.

The real challenge is resisting the comforting illusion that lawmaking alone is governance—and doing the slower, quieter, less visible work of strengthening the systems that stand between instability and catastrophe.

Population as the Primary and Predictable Driver of Local Government Forecasting

A collaboration between Lewis McLain & AI

A technical framework for staffing, facilities, and cost projection

Abstract

In local government forecasting, population is the dominant driver of service demand, staffing requirements, facility needs, and operating costs. While no municipal system can be forecast with perfect precision, population-based models—when properly structured—produce estimates that are sufficiently accurate for planning, budgeting, and capital decision-making. Crucially, population growth in cities is not a sudden or unknowable event.

Through annexation, zoning, platting, infrastructure construction, utility connections, and certificates of occupancy, population arrival is observable months or years in advance. This paper presents population not merely as a driver, but as a leading indicator, and demonstrates how cities can convert development approvals into staged population forecasts that support rational staffing, facility sizing, capital investment, and operating cost projections.


1. Introduction: Why population sits at the center

Local governments exist to provide services to people. Police protection, fire response, streets, parks, water, sanitation, administration, and regulatory oversight are all mechanisms for supporting a resident population and the activity it generates. While policy choices and service standards influence how services are delivered, the volume of demand originates with population.

Practitioners often summarize this reality informally:

“Tell me the population, and I can tell you roughly how many police officers you need.
If I know the staff, I can estimate the size of the building.
If I know the size, I can estimate the construction cost.
If I know the size, I can estimate the electricity bill.”

This paper formalizes that intuition into a defensible forecasting framework and addresses a critical objection: population is often treated as uncertain or unknowable. In practice, population growth in cities is neither sudden nor mysterious—it is permitted into existence through public processes that unfold over years.


2. Population as a base driver, not a single-variable shortcut

Population does not explain every budget line, but it explains most recurring demand when paired with a small number of modifiers.

At its core, many municipal services follow this structure:

Total Demand = α + β · Population

Where:

  • α (fixed minimum) represents baseline capacity required regardless of size (minimum staffing, governance, 24/7 coverage).
  • β (variable component) represents incremental demand generated by each additional resident.

This structure explains why:

  • Small cities appear “overstaffed” per capita (fixed minimum dominates).
  • Mid-sized and large cities stabilize into predictable staffing ratios.
  • Growth pressures emerge when population increases faster than capacity adjustments.

Population therefore functions as the load variable of local government, analogous to demand in utility planning.
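To make this fixed-plus-variable structure concrete, here is a minimal sketch in Python; the α and β values are illustrative assumptions, not benchmarks, and each city would calibrate them from its own history.

```python
def service_demand(population: int, alpha: float, beta: float) -> float:
    """Fixed-plus-variable demand model: alpha is the baseline capacity
    required at any size; beta is incremental demand per resident."""
    return alpha + beta * population

# Illustrative (assumed) parameters for a generic service:
# a 10-FTE fixed minimum plus 1.5 FTE per 1,000 residents.
alpha, beta = 10.0, 1.5 / 1_000

for pop in (5_000, 25_000, 100_000):
    fte = service_demand(pop, alpha, beta)
    print(f"Population {pop:>7,}: {fte:5.1f} FTE "
          f"({fte / pop * 1_000:.2f} per 1,000 residents)")
```

Under these assumed values, the 5,000-resident city shows 3.5 FTE per 1,000 because the fixed minimum dominates, while at 100,000 the ratio settles toward β at 1.6 per 1,000, which is exactly the per-capita pattern described above.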


3. Why population reliably predicts service demand

3.1 People generate transactions

Residents generate:

  • Calls for service
  • Utility usage
  • Permits and inspections
  • Court activity
  • Recreation participation
  • Library circulation
  • Administrative transactions (HR, payroll, finance, IT)

While individual events vary, aggregate demand scales with population.

3.2 Capacity, not consumption, drives budgets

Municipal budgets fund capacity, not just usage:

  • Staff must be available before calls occur
  • Facilities must exist before staff are hired
  • Vehicles and equipment must be in place before service delivery

Capacity decisions are inherently population-driven.


4. Population growth is observable before it arrives

A defining feature of local government forecasting—often underappreciated—is that population growth is authorized through public approvals long before residents appear in census or utility data.

Population does not “arrive”; it progresses through a pipeline.


5. The development pipeline as a population forecasting timeline

5.1 Annexation: strategic intent (years out)

Annexation establishes:

  • Jurisdictional responsibility
  • Long-term service obligations
  • Future land-use authority

While annexation does not create immediate population, it signals where population will eventually be allowed.

Forecast role:

  • Long-range horizon marker
  • Infrastructure and service envelope planning
  • Typical lead time: 3–10 years

5.2 Zoning: maximum theoretical population

Zoning converts land into entitled density.

From zoning alone, cities can estimate:

  • Maximum dwelling units
  • Maximum population at buildout
  • Long-run service ceilings

Zoning defines upper bounds, even if timing is uncertain.

Forecast role:

  • Long-range capacity planning
  • Useful for master plans and utility sizing
  • Typical lead time: 3–7 years

5.3 Preliminary plat: credible development intent

Preliminary plat approval signals:

  • Developer capital commitment
  • Defined lot counts
  • Identified phasing

Population estimates become quantifiable, even if delivery timing varies.

Forecast role:

  • Medium-high certainty population
  • First stage for phased population modeling
  • Typical lead time: 1–3 years

5.4 Final plat: scheduled population

Final plat approval:

  • Legally creates lots
  • Locks in density and configuration
  • Triggers infrastructure construction
  • Commits impact fees and other development costs

At this point, population arrival is no longer speculative.

Forecast role:

  • High-confidence population forecasting
  • Suitable for annual budget and staffing models
  • Typical lead time: 6–24 months

5.5 Infrastructure construction: timing constraints

Once streets, utilities, and drainage are built, population arrival becomes physically constrained by construction schedules.

Forecast role:

  • Narrow timing window
  • Supports staffing lead-time decisions
  • Typical lead time: 6–18 months

5.6 Water meter connections: imminent occupancy

Water meters are one of the most reliable near-term indicators:

  • Each residential meter ≈ one household
  • Installations closely precede vertical construction

Forecast role:

  • Quarterly or monthly population forecasting
  • Just-in-time operational scaling
  • Typical lead time: 1–6 months

5.7 Certificates of Occupancy: population realized

Certificates of occupancy convert permitted population into actual population.

At this point:

  • Service demand begins immediately
  • Utility consumption appears
  • Forecasts can be validated

Forecast role:

  • Confirmation and calibration
  • Not prediction

6. Population forecasting as a confidence ladder

Development Stage       Population Certainty   Timing Precision   Planning Use
Annexation              Low                    Very low           Strategic
Zoning                  Low–Medium             Low                Capacity envelopes
Preliminary Plat        Medium                 Medium             Phased planning
Final Plat              High                   Medium–High        Budget & staffing
Infrastructure Built    Very High              High               Operational prep
Water Meters            Extremely High         Very High          Near-term ops
COs                     Certain                Exact              Validation

Population forecasting in cities is therefore graduated, not binary.
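To illustrate how these stages might be combined into a single expected-population figure, the sketch below weights each stage’s entitled units by an assumed probability of occupancy within the planning horizon and an assumed household size. Every unit count, probability, and the persons-per-household value is a hypothetical placeholder for local calibration.

```python
# Minimal sketch: stage pipeline approvals into an expected-population
# forecast. Unit counts, probabilities, and persons per household (PPH)
# are assumed for illustration only.
PPH = 2.8  # assumed average persons per household

# (stage, entitled dwelling units, assumed probability of occupancy
#  within the planning horizon)
pipeline = [
    ("Zoning entitlement",  1_200, 0.30),
    ("Preliminary plat",      600, 0.60),
    ("Final plat",            400, 0.90),
    ("Infrastructure built",  250, 0.95),
    ("Water meters set",      120, 0.99),
]

expected = 0.0
for stage, units, prob in pipeline:
    people = units * prob * PPH
    expected += people
    print(f"{stage:<22} {units:>5} units x {prob:.2f} -> {people:7.0f} people")

print(f"\nExpected added population over the horizon: {expected:,.0f}")
```

Note how the weights enforce the confidence ladder: early-stage entitlements contribute to capacity envelopes but are heavily discounted, while final plats and water meters drive the near-term numbers a budget can rely on.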


7. From population to staffing

Once population arrival is staged, staffing can be forecast using service-specific ratios and fixed minimums.

7.1 Police example (illustrative ranges)

Sworn officers per 1,000 residents commonly stabilize within broad bands that depend on service level and demand, and should be anchored to known local ratios:

  • Lower demand: ~1.2–1.8
  • Moderate demand: ~1.8–2.4
  • High demand: ~2.4–3.5+

Civilian support staff often scale as a fraction of sworn staffing.

The appropriate structure is:

Officers = α_police + β_police · Population

Where α accounts for minimum 24/7 coverage and supervision.
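As a purely illustrative calculation: with an assumed fixed minimum of α = 8 sworn positions (enough for continuous 24/7 coverage plus supervision) and β = 1.8 officers per 1,000 residents, a city of 30,000 would plan for 8 + (1.8 × 30) = 62 sworn officers. The same β applied as a straight ratio would yield only 9 officers for a city of 5,000, versus the 17 the fixed-plus-variable model produces, which is why the α term matters most at small scale.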


7.2 General government staffing

Administrative staffing scales with:

  • Population
  • Number of employees
  • Asset inventory
  • Transaction volume

A fixed core plus incremental per-capita growth captures this reality more accurately than pure ratios.


8. From staffing to facilities

Facilities are a function of:

  • Headcount
  • Service configuration
  • Security and public access needs

A practical planning method:

Facility Size = FTE · Gross SF per FTE

Blended civic office planning ranges typically fall within:

  • ~175–300 gross SF per employee

Specialized spaces (dispatch, evidence, fleet, courts) are layered on separately.


9. From facilities to capital and operating costs

9.1 Capital costs

Capital expansion costs are typically modeled as:

Capex = Added SF · Cost per SF · (1 + Soft Costs)

Where soft costs include design, permitting, contingencies, and escalation.


9.2 Operating costs

Facility operating costs scale predictably with size:

  • Electricity: kWh per SF per year
  • Maintenance: % of replacement value or $/SF
  • Custodial: $/SF
  • Lifecycle renewals

Electricity alone can be reasonably estimated as:

Annual Cost = SF · kWh/SF · $/kWh

This is rarely exact—but it is directionally reliable.
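Chaining sections 7 through 9 together, the following sketch shows the arithmetic end to end under assumed unit costs; all values are illustrative and should be replaced with local benchmarks.

```python
# Minimal sketch chaining staffing -> facility size -> capital and
# operating cost. Every unit value below is an assumption to be
# replaced with local benchmarks.
added_fte       = 40      # new positions justified by approved growth
sf_per_fte      = 250     # gross square feet per employee (assumed)
cost_per_sf     = 450.0   # hard construction cost, $/SF (assumed)
soft_cost_rate  = 0.30    # design, permitting, contingency (assumed)
kwh_per_sf_year = 15.0    # civic-office energy intensity (assumed)
rate_per_kwh    = 0.11    # blended electric rate, $/kWh (assumed)

added_sf  = added_fte * sf_per_fte
capex     = added_sf * cost_per_sf * (1 + soft_cost_rate)
elec_cost = added_sf * kwh_per_sf_year * rate_per_kwh

print(f"Added space:        {added_sf:,} SF")
print(f"Capital cost:       ${capex:,.0f}")
print(f"Annual electricity: ${elec_cost:,.0f}")
```

Under these assumptions the output (10,000 SF, roughly $5.85 million in capital, about $16,500 per year in electricity) is a planning magnitude, not a bid.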


10. Key modifiers that refine population models

Population alone is powerful but incomplete. High-quality forecasts adjust for:

  • Density and land use
  • Daytime population and employment
  • Demographics
  • Service standards
  • Productivity and technology
  • Geographic scale (lane miles, acres)

These modifiers refine, but do not replace, population as the base driver.


11. Why growth surprises cities anyway

When cities claim growth was “unexpected,” the issue is rarely lack of information. More often:

  • Development signals were not integrated into finance models
  • Staffing and capital planning lagged approvals
  • Fixed minimums were ignored
  • Threshold effects (new stations, expansions) were deferred too long

Growth that appears sudden is usually forecastable growth that was not operationalized.


12. Conclusion

Population is the primary driver of local government demand, but more importantly, it is a predictable driver. Through annexation, zoning, platting, infrastructure construction, utility connections, and certificates of occupancy, cities possess a multi-year advance view of population arrival.

This makes it possible to:

  • Phase staffing rationally
  • Time facilities before overload
  • Align capital investment with demand
  • Improve credibility with councils, auditors, and rating agencies

In local government, population growth is not a surprise. It is a permitted, engineered, and scheduled outcome of public decisions. A forecasting system that treats population as both a driver and a leading indicator is not speculative—it is simply paying attention to the city’s own approvals.


Appendix A

Defensibility of Population-Driven Forecasting Models

A response framework for auditors, rating agencies, and governing bodies

Purpose of this appendix

This appendix addresses a common concern raised during budget reviews, audits, bond disclosures, and council deliberations:

“Population-based forecasts seem too simplistic or speculative.”

The purpose here is not to argue that population is the only factor affecting local government costs, but to demonstrate that population-driven forecasting—when anchored to development approvals and adjusted for service standards—is methodologically sound, observable, and conservative.


A.1 Population forecasting is not speculative in local government

A frequent misconception is that population forecasts rely on demographic projections or external estimates. In practice, this model relies primarily on the city’s own legally binding approvals.

Population growth enters the forecast only after it has passed through:

  • Annexation agreements
  • Zoning entitlements
  • Preliminary and final plats
  • Infrastructure construction
  • Utility connections
  • Certificates of occupancy

These are public, documented actions, not assumptions.

Key distinction for reviewers:
This model does not ask “How fast might the city grow?”
It asks “What growth has the city already approved, and when will it become occupied?”


A.2 Population is treated as a leading indicator, not a lagging one

Traditional population measures (census counts, ACS estimates) are lagging indicators. This model explicitly avoids relying on those for near-term forecasting.

Instead, it uses development milestones as leading indicators, each with increasing certainty and narrower timing windows.

For audit and disclosure purposes:

  • Early-stage entitlements affect only long-range capacity planning
  • Staffing and capital decisions are triggered only at later, high-certainty stages
  • Near-term operating impacts are tied to utility connections and COs

This layered approach prevents premature spending while avoiding reactive under-staffing.


A.3 Fixed minimums prevent over-projection in small or slow-growth cities

A common audit concern is that per-capita models overstate staffing needs.

This model explicitly separates:

  • Fixed baseline capacity (α)
  • Incremental population-driven capacity (β)

This structure:

  • Prevents unrealistic staffing increases in early growth stages
  • Accurately reflects real-world minimum staffing requirements
  • Explains why per-capita ratios vary by city size

Auditors should note that this approach is more conservative than straight-line per-capita extrapolation.


A.4 Service standards are explicit policy inputs, not hidden assumptions

Population does not automatically dictate staffing levels. Staffing reflects policy decisions.

This model requires the city to explicitly state:

  • Response time targets
  • Service frequency goals
  • Coverage expectations
  • Hours of operation

As a result:

  • Changes in staffing can be clearly attributed to either population growth or policy change
  • Council decisions are transparently reflected in forecasts
  • The model separates “growth pressure” from “service enhancements or reductions”

This clarity improves accountability rather than obscuring it.


A.5 Facilities and capital projections follow staffing, not speculation

Another concern raised by reviewers is that population forecasts may be used to justify premature capital expansion.

This model deliberately enforces a sequencing discipline:

  1. Population approvals observed
  2. Staffing thresholds reached
  3. Facility capacity constraints identified
  4. Capital expansion triggered

Facilities are not expanded because population might grow, but because staffing—already justified by approved growth—can no longer be accommodated.

This mirrors best practices in asset management and avoids front-loading debt.


A.6 Operating cost estimates use industry-standard unit costs

Electricity, maintenance, custodial, and lifecycle costs are estimated using:

  • Per-square-foot benchmarks
  • Historical city utility data where available
  • Conservative unit assumptions

These are not novel or experimental methods. They are the same unit-cost techniques commonly used in:

  • CIP planning
  • Facility condition assessments
  • Energy benchmarking
  • Budget impact statements

Auditors should view these estimates as planning magnitudes, not precise bills—and that distinction is explicitly stated in the model documentation.


A.7 The model is testable and falsifiable

A major strength of this approach is that it can be validated against actual outcomes.

As certificates of occupancy are issued:

  • Actual population arrival can be compared to forecasts
  • Staffing changes can be reconciled
  • Utility consumption can be measured

This allows:

  • Annual recalibration
  • Error tracking
  • Continuous improvement

Models that can be tested and corrected are inherently more defensible than opaque judgment-based forecasts.
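A brief sketch of the recalibration loop described above, comparing forecast household arrivals to actual certificates of occupancy; all figures are hypothetical.

```python
# Hypothetical annual recalibration: compare forecast household
# arrivals to actual certificates of occupancy (COs); the error
# feeds next year's absorption assumptions.
forecast_households = {"2023": 380, "2024": 420, "2025": 450}
actual_cos          = {"2023": 352, "2024": 441, "2025": 438}

for year, forecast in forecast_households.items():
    actual = actual_cos[year]
    error_pct = (actual - forecast) / forecast * 100
    print(f"{year}: forecast {forecast}, actual {actual}, "
          f"error {error_pct:+.1f}%")
```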


A.8 Why this approach aligns with rating-agency expectations

Bond rating agencies consistently emphasize:

  • Predictability
  • Governance discipline
  • Forward planning
  • Avoidance of reactive financial decisions

This framework demonstrates:

  • Awareness of growth pressures well in advance
  • Phased responses rather than abrupt spending
  • Clear linkage between approvals, staffing, and capital
  • Conservative treatment of uncertainty

As such, population-driven forecasting anchored to development approvals should be viewed as a credit positive, not a risk.


A.9 Summary for reviewers

For audit, disclosure, and governance purposes, the following conclusions are reasonable:

  1. Population growth in cities is observable years in advance through public approvals.
  2. Using approved development as a population driver is evidence-based, not speculative.
  3. Fixed minimums and service-level inputs prevent mechanical over-projection.
  4. Staffing precedes facilities; facilities precede capital.
  5. Operating costs scale predictably with assets and space.
  6. The model is transparent, testable, and adjustable.

Therefore:
A population-driven forecasting model of this type represents a prudent, defensible, and professionally reasonable approach to long-range municipal planning.


Appendix B

Consequences of Failing to Anticipate Population Growth

A diagnostic review of reactive municipal planning

Purpose of this appendix

This appendix describes common failure patterns observed in cities that do not systematically link development approvals to population, staffing, and facility planning. These outcomes are not the result of negligence or bad intent; they typically arise from fragmented information, short planning horizons, or the absence of an integrated forecasting framework.

The patterns described below are widely recognized in municipal practice and are offered to illustrate the practical risks of reactive planning.


B.1 “Surprise growth” that was not actually a surprise

A frequent narrative in reactive cities is that growth “arrived suddenly.” In most cases, the growth was visible years earlier through zoning approvals, plats, or utility extensions but was not translated into staffing or capital plans.

Common indicators:

  • Approved subdivisions not reflected in operating forecasts
  • Development tracked only by planning staff, not finance or operations
  • Population discussed only after occupancy

Consequences:

  • Budget shocks
  • Emergency staffing requests
  • Loss of credibility with governing bodies

B.2 Knee-jerk staffing reactions

When growth impacts become unavoidable, reactive cities often respond through hurried staffing actions.

Typical symptoms:

  • Mid-year supplemental staffing requests
  • Heavy reliance on overtime
  • Accelerated hiring without workforce planning
  • Training pipelines overwhelmed

Consequences:

  • Elevated labor costs
  • Increased burnout and turnover
  • Declining service quality during growth periods
  • Inefficient long-term staffing structures

B.3 Under-sizing followed by over-correction

Without forward planning, cities often alternate between two extremes:

  1. Under-sizing due to conservative or delayed response
  2. Over-sizing in reaction to service breakdowns

Examples:

  • Facilities built too small “to be safe”
  • Rapid expansions shortly after completion
  • Swing from staffing shortages to excess capacity

Consequences:

  • Higher lifecycle costs
  • Poor space utilization
  • Perception of waste or mismanagement

B.4 Obsolete facilities at the moment of completion

Facilities planned without reference to future population often open already constrained.

Common causes:

  • Planning based on current headcount only
  • Ignoring entitled but unoccupied development
  • Failure to include expansion capability

Consequences:

  • Expensive retrofits
  • Disrupted operations during expansion
  • Shortened facility useful life

This is one of the most costly errors because capital investments are long-lived and difficult to correct.


B.5 Deferred capital followed by crisis-driven spending

Reactive cities often delay capital investment until systems fail visibly.

Typical patterns:

  • Fire stations added only after response times degrade
  • Police facilities expanded only after overcrowding
  • Utilities upgraded only after service complaints

Consequences:

  • Emergency procurement
  • Higher construction costs
  • Increased debt stress
  • Lost opportunity for phased financing

B.6 Misalignment between departments

When population intelligence is not shared across departments:

  • Planning knows what is coming
  • Finance budgets based on current year
  • Operations discover impacts last

Consequences:

  • Conflicting narratives to council
  • Fragmented decision-making
  • Reduced trust between departments

Population-driven forecasting provides a common factual baseline.


B.7 Overreliance on lagging indicators

Reactive cities often rely heavily on:

  • Census updates
  • Utility consumption after occupancy
  • Service call increases

These indicators confirm growth after it has already strained capacity.

Consequences:

  • Persistent lag between demand and response
  • Structural understaffing
  • Continual “catch-up” budgeting

B.8 Political whiplash and credibility erosion

Unanticipated growth pressures often force councils into repeated difficult votes:

  • Emergency funding requests
  • Mid-year budget amendments
  • Rapid debt authorizations

Over time, this leads to:

  • Voter skepticism
  • Council fatigue
  • Reduced tolerance for legitimate future investments

Planning failures become governance failures.


B.9 Inefficient use of taxpayer dollars

Ironically, reactive planning often costs more, not less.

Cost drivers include:

  • Overtime premiums
  • Compressed construction schedules
  • Retrofit and rework costs
  • Higher borrowing costs due to rushed timing

Proactive planning spreads costs over time and reduces risk premiums.


B.10 Organizational stress and morale impacts

Staff experience growth pressures first.

Observed impacts:

  • Chronic overtime
  • Inadequate workspace
  • Equipment shortages
  • Frustration with leadership responsiveness

Over time, this contributes to:

  • Higher turnover
  • Loss of institutional knowledge
  • Reduced service consistency

B.11 Why these failures persist

These patterns are not caused by incompetence. They persist because:

  • Growth information is siloed
  • Forecasting is viewed as speculative
  • Political incentives favor short-term restraint
  • Capital planning horizons are too short

Absent a formal framework, cities default to reaction.


B.12 Summary for governing bodies

Cities that do not integrate development approvals into population-driven forecasting commonly experience:

  1. Perceived “surprise” growth
  2. Emergency staffing responses
  3. Repeated under- and over-sizing
  4. Facilities that age prematurely
  5. Higher long-term costs
  6. Organizational strain
  7. Reduced public confidence

None of these outcomes are inevitable. They are symptoms of not using information the city already has.


B.13 Closing observation

The contrast between proactive and reactive cities is not one of optimism versus pessimism. It is a difference between:

  • Anticipation versus reaction
  • Sequencing versus scrambling
  • Planning versus explaining after the fact

Population-driven forecasting does not eliminate uncertainty. It replaces surprise with preparation.


Appendix C

Population Readiness & Forecasting Discipline Checklist

A self-assessment for proactive versus reactive cities

Purpose:
This checklist allows a city to evaluate whether it is systematically anticipating population growth—or discovering it after impacts occur. It is designed for use by city management teams, finance directors, auditors, and governing bodies.

How to use:
For each item, mark:

  • ✅ Yes / In place
  • ⚠️ Partially / Informal
  • ❌ No / Not done

Patterns matter more than individual answers.


Section 1 — Visibility of Future Population

C-1 Do we maintain a consolidated list of annexed, zoned, and entitled land with estimated buildout population?

C-2 Are preliminary and final plats tracked in a format usable by finance and operations (not just planning)?

C-3 Do we estimate population by development phase, not just at full buildout?

C-4 Is there a documented method for converting lots or units into population (household size assumptions reviewed periodically)? (A minimal conversion sketch follows this section.)

C-5 Do we distinguish between long-range potential growth and near-term probable growth?

Red flag:
Population is discussed primarily in narrative terms (“fast growth,” “slowing growth”) rather than quantified and staged.
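
A minimal sketch of the kind of documented conversion method C-4 asks for; the occupancy rate and persons-per-household figures are illustrative assumptions that should be reviewed against local data.

```python
# Converting platted lots into expected population (see C-4).
# Occupancy rate and household size below are illustrative assumptions.

def lots_to_population(platted_lots, occupancy_rate=0.95,
                       persons_per_household=2.7):
    """Convert platted lots into an expected resident count."""
    return platted_lots * occupancy_rate * persons_per_household

print(f"{lots_to_population(500):,.0f} expected residents from 500 lots")
```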


Section 2 — Timing and Lead Indicators

C-6 Do we identify which development milestone triggers planning action (e.g., preliminary plat vs final plat)?

C-7 Are infrastructure completion schedules incorporated into population timing assumptions?

C-8 Are water meter installations or equivalent utility connections tracked and forecasted?

C-9 Do we use certificates of occupancy to validate and recalibrate population forecasts annually?

C-10 Is population forecasting treated as a rolling forecast, not a once-per-year estimate?

Red flag:
Population is updated only when census or ACS data is released.


Section 3 — Staffing Linkage

C-11 Does each major department have an identified population or workload driver?

C-12 Are fixed minimum staffing levels explicitly separated from growth-driven staffing?

C-13 Are staffing increases tied to forecasted population arrival, not service breakdowns?

C-14 Do hiring plans account for lead times (recruitment, academies, training)?

C-15 Can we explain recent staffing increases as either:

  • population growth, or
  • explicit policy/service-level changes?

Red flag:
Staffing requests frequently cite “we are behind” without reference to forecasted growth.


Section 4 — Facilities and Capital Planning

C-16 Are facility size requirements derived from staffing projections, not current headcount?

C-17 Do capital plans include expansion thresholds (e.g., headcount or service load triggers)?

C-18 Are new facilities designed with future expansion capability?

C-19 Are entitled-but-unoccupied developments considered when evaluating future facility adequacy?

C-20 Do we avoid building facilities that are at or near capacity on opening day?

Red flag:
Facilities require major expansion within a few years of completion.


Section 5 — Operating Cost Awareness

C-21 Are operating costs (utilities, maintenance, custodial) modeled as a function of facility size and assets?

C-22 Are utility cost impacts of expansion estimated before facilities are approved?

C-23 Do we understand how population growth affects indirect departments (HR, IT, finance)?

C-24 Are lifecycle replacement costs considered when adding capacity?

Red flag:
Operating cost increases appear as “unavoidable surprises” after facilities open.


Section 6 — Cross-Department Integration

C-25 Do planning, finance, and operations use the same population assumptions?

C-26 Is growth discussed in joint meetings, not only within planning?

C-27 Does finance receive regular updates on development pipeline status?

C-28 Are growth assumptions documented and shared, not implicit or informal?

Red flag:
Different departments give different growth narratives to council.


Section 7 — Governance and Transparency

C-29 Can we clearly explain to council why staffing or capital is needed before service failure occurs?

C-30 Are population-driven assumptions documented in budget books or CIP narratives?

C-31 Do we distinguish between:

  • growth-driven needs, and
  • discretionary service enhancements?

C-32 Can auditors or rating agencies trace growth-related decisions back to documented approvals?

Red flag:
Growth explanations rely on urgency rather than evidence.


Section 8 — Validation and Learning

C-33 Do we compare forecasted population arrival to actual COs annually?

C-34 Are forecasting errors analyzed and corrected rather than ignored?

C-35 Do we adjust household size, absorption rates, or timing assumptions over time?

Red flag:
Forecasts remain unchanged year after year despite clear deviations.


Scoring Interpretation (Optional)

  • Mostly ✅ → Proactive, anticipatory city
  • Mix of ✅ and ⚠️ → Partially planned, risk of reactive behavior
  • Many ❌ → Reactive city; growth will feel like a surprise

A city does not need perfect scores. The presence of structure, documentation, and sequencing is what matters.


Closing Note for Leadership

If a city can answer most of these questions affirmatively, it is not guessing about growth—it is managing it. If many answers are negative, the city is likely reacting to outcomes it had the power to anticipate.

Population growth does not cause planning problems.
Ignoring known growth signals does.


Appendix D

Population-Driven Planning Maturity Model

A framework for assessing and improving municipal forecasting discipline

Purpose of this appendix

This maturity model describes how cities evolve in their ability to anticipate population growth and translate it into staffing, facility, and financial planning. It recognizes that most cities are not “good” or “bad” planners; they are simply at different stages of organizational maturity.

Each level builds logically on the prior one. Advancement does not require perfection—only structure, integration, and discipline.


Level 1 — Reactive City

“We didn’t see this coming.”

Characteristics

  • Population discussed only after impacts are felt
  • Reliance on census or anecdotal indicators
  • Growth described qualitatively (“exploding,” “slowing”)
  • Staffing added only after service failure
  • Capital projects triggered by visible overcrowding
  • Frequent mid-year budget amendments

Typical behaviors

  • Emergency staffing requests
  • Heavy overtime usage
  • Facilities opened already constrained
  • Surprise operating cost increases

Organizational mindset

Growth is treated as external and unpredictable.

Risks

  • Highest long-term cost
  • Lowest credibility with councils and rating agencies
  • Chronic organizational stress

Level 2 — Aware but Unintegrated City

“Planning knows growth is coming, but others don’t act on it.”

Characteristics

  • Development pipeline tracked by planning
  • Finance and operations not fully engaged
  • Growth acknowledged but not quantified in budgets
  • Capital planning still reactive
  • Limited documentation of assumptions

Typical behaviors

  • Late staffing responses despite known development
  • Facilities planned using current headcount
  • Disconnect between planning reports and budget narratives

Organizational mindset

Growth is known, but not operationalized.

Risks

  • Continued surprises
  • Internal frustration
  • Mixed messages to council

Level 3 — Structured Forecasting City

“We model growth, but execution lags.”

Characteristics

  • Population forecasts tied to development approvals
  • Preliminary staffing models exist
  • Fixed minimums recognized
  • Capital needs identified in advance
  • Forecasts updated annually

Typical behaviors

  • Better budget explanations
  • Improved CIP alignment
  • Still some late responses due to execution gaps

Organizational mindset

Growth is forecastable, but timing discipline is still developing.

Strengths

  • Credible analysis
  • Reduced emergencies
  • Clearer governance conversations

Level 4 — Integrated Planning City

“Approvals, staffing, and capital move together.”

Characteristics

  • Development pipeline drives population timing
  • Staffing plans phased to population arrival
  • Facility sizing based on projected headcount
  • Operating costs modeled from assets
  • Cross-department coordination is routine

Typical behaviors

  • Hiring planned ahead of demand
  • Facilities open with expansion capacity
  • Capital timed to avoid crisis spending
  • Clear audit trail from approvals to costs

Organizational mindset

Growth is managed, not reacted to.

Benefits

  • Stable service delivery during growth
  • Higher workforce morale
  • Strong credibility with governing bodies

Level 5 — Adaptive, Data-Driven City

“We learn, recalibrate, and optimize continuously.”

Characteristics

  • Rolling population forecasts
  • Development milestones tracked in near-real time
  • Annual validation against COs and utility data
  • Forecast errors analyzed and corrected
  • Scenario modeling for alternative growth paths

Typical behaviors

  • Minimal surprises
  • High confidence in long-range plans
  • Early identification of inflection points
  • Proactive communication with councils and investors

Organizational mindset

Growth is a controllable system, not a threat.

Benefits

  • Lowest lifecycle cost
  • Highest service reliability
  • Institutional resilience

Summary Table

  Level   Description            Core Risk
  1       Reactive               Crisis-driven decisions
  2       Aware, unintegrated    Late responses
  3       Structured             Execution lag
  4       Integrated             Few surprises
  5       Adaptive               Minimal risk

Key Insight

Most cities are not failing—they are stuck between Levels 2 and 3. The largest gains come not from sophisticated analytics, but from integration and timing discipline.

Progression does not require:

  • Perfect forecasts
  • Advanced software
  • Large consulting engagements

It requires:

  • Using approvals the city already grants
  • Sharing population assumptions across departments
  • Sequencing decisions intentionally

Closing Observation

Cities do not choose whether they grow. They choose whether growth feels like a surprise or a scheduled event.

This maturity model makes that choice visible.

The Supreme Court and Texas Redistricting: Arguments, Standards, and the Court’s Conclusions

A collaboration between Lewis McLain & AI

For more than fifty years, Texas has been at the center of American redistricting law. Few states have produced as many major Supreme Court decisions shaping the meaning of the Voting Rights Act, the boundaries of racial gerrymandering doctrine, and—perhaps most significantly—the Court’s modern unwillingness to police partisan gerrymandering.

Two cases define the modern era for Texas: LULAC v. Perry (2006) and Abbott v. Perez (2018). Together, they reveal how the Court analyzes racial vote dilution, when partisan motives are permissible, how intent is inferred or rejected, and what evidentiary burdens challengers must meet.

At the heart of the Court’s reasoning is a recurring tension:

  • the Constitution forbids racial discrimination in redistricting,
  • the Voting Rights Act prohibits plans that diminish minority voting strength,
  • but the Court has repeatedly held that partisan advantage, even aggressive partisan advantage, is not generally unconstitutional.

Texas’s maps have allowed the Court to articulate, refine, and—many argue—narrow these doctrines.


I. LULAC v. Perry (2006): Partisan Motives Allowed, But Minority Vote Dilution Not

Background

In 2003, after winning unified control of state government, Texas Republicans enacted a mid-decade congressional redistricting plan replacing the court-drawn map used in 2002. It was an openly partisan effort to convert a congressional delegation that had favored Democrats into a Republican-leaning one.

Challengers argued:

  1. The mid-decade redistricting itself was unconstitutional.
  2. The legislature’s partisan intent violated the Equal Protection Clause.
  3. The plan diluted Latino voting strength in violation of Section 2 of the Voting Rights Act, particularly in old District 23.
  4. Several districts were racial gerrymanders, subordinating race to politics.

Arguments Before the Court

  • Challengers:
    • Texas had engaged in unprecedented partisan manipulation lacking a legitimate state purpose.
    • The dismantling of Latino opportunity districts—especially District 23—reduced the community’s ability to elect its preferred candidate.
    • Race was used as a tool to achieve partisan ends, in violation of the Shaw v. Reno line of racial gerrymandering doctrine.
  • Texas:
    • Nothing in the Constitution forbids mid-decade redistricting.
    • Political gerrymandering, even when aggressive and obvious, was allowed under Davis v. Bandemer (1986).
    • Latino voters in District 23 were not “cohesive” enough to qualify for Section 2 protection.
    • District configurations reflected permissible political considerations.

The Court’s Decision

The Court’s ruling was a fractured opinion, but several clear conclusions emerged.

1. Mid-Decade Redistricting Is Constitutional

The Court held that states are not restricted to once-a-decade redistricting. Nothing in the Constitution or federal statute bars legislatures from replacing a map mid-cycle.
This effectively legitimized Texas’s overtly partisan decision to redraw the map simply because political control had shifted.

2. Partisan Gerrymandering Claims Remain Non-Justiciable (or Nearly So)

The Court again declined to articulate a manageable standard for judging partisan gerrymandering.
Justice Kennedy, writing for the controlling plurality, expressed concern about severe partisan abuses but concluded that no judicially administrable rule existed.

Key takeaway:
Texas’s partisan motivation, even if blatant, was not itself unconstitutional.

3. Section 2 Violation in District 23: Latino Voting Strength Was Illegally Diluted

This was the major substantive ruling.

The Court found that Texas dismantled an existing Latino opportunity district (CD-23) precisely because Latino voters were on the verge of electing their preferred candidate.
The legislature:

  • removed tens of thousands of cohesive Latino voters from the district,
  • replaced them with low-turnout Latino populations less likely to vote against the incumbent,
  • and justified the move under the guise of creating a new Latino-majority district elsewhere.

This manipulation, the Court held, denied Latino voters an equal opportunity to elect their candidate of choice, violating Section 2.

4. Racial Gerrymandering Claims Mostly Fail

The Court rejected most Shaw-type racial gerrymandering claims because plaintiffs failed to prove that race, rather than politics, predominated.
This reflects a theme that becomes even stronger in later cases:
when race and politics correlate—as they often do in Texas—challengers must provide powerful evidence that race, not party, drove the lines.


II. Abbott v. Perez (2018): A High Bar for Proving Discriminatory Intent

Background

After the 2010 census, Texas enacted new maps in 2011. Amid legal challenges, a federal district court put its own interim maps in place for the 2012 elections; the court later found that several districts in the 2011 plans had been drawn with discriminatory intent. In 2013, Texas enacted maps that were largely identical to the court's interim maps.

Challengers argued that:

  1. The original 2011 maps were passed with discriminatory intent.
  2. The 2013 maps, though based on the court’s design, continued to embody the taint of 2011.
  3. Multiple districts across Texas diluted minority voting strength or were racial gerrymanders.

Texas argued that:

  • The 2013 maps were valid because they were largely adopted from a court-approved version.
  • Any discriminatory intent from 2011 could not be imputed to the 2013 legislature.
  • Plaintiffs bore the burden of proving intentional discrimination district by district.

The Court’s Decision

In a 5–4 ruling, the Supreme Court reversed almost all findings of discriminatory intent against Texas.

1. Burden of Proof Is on Challengers, Not the State

The Court rejected the lower court’s presumption that Texas acted with discriminatory intent in 2013 merely because the 2011 legislature had been found to do so.

Key Holding:
A finding of discriminatory intent in a prior map does not shift the burden; challengers must prove new intent for each new plan.

This significantly tightened the evidentiary bar.

2. Presumption of Legislative Good Faith

Justice Alito, writing for the majority, emphasized a longstanding principle:

Legislatures are entitled to a presumption of good faith unless challengers provide direct and persuasive evidence otherwise.

This presumption made it much harder to prove racial discrimination unless emails, testimony, or map-drawing files showed explicit racial motives.

3. Section 2 Vote Dilution Claims Largely Rejected

Challengers failed to show that minority voters were both cohesive and systematically defeated by white bloc voting in many districts.
The Court stressed the need for:

  • clear demographic evidence,
  • consistent voting patterns,
  • and demonstration of feasible alternative districts.

4. Only One District Violated the Constitution

The Court affirmed discrimination in Texas House District 90, where the legislature had intentionally moved Latino voters to achieve a specific racial composition.

But the Court rejected violations in every other challenged district.

5. Practical Effect: Courts Must Defer Unless Evidence Is Unusually Strong

Abbott v. Perez is widely viewed as one of the strongest modern statements of judicial deference to legislatures in redistricting—even when past discrimination has been found.

Justice Sotomayor’s dissent accused the majority of blinding itself to the overwhelming factual record of discrimination developed in the courts below.


III. What These Cases Together Mean: Why the Court Upheld Texas’s Maps

Across both LULAC (2006) and Abbott (2018), a coherent theme emerges in the Supreme Court’s reasoning:

1. Partisan Gerrymandering Is Not the Court’s Job to Police

Unless partisan advantage clearly crosses into racial targeting, the Court will not strike it down.
Texas repeatedly argued political motives, and the Court repeatedly accepted them as legitimate.

2. Racial Discrimination Must Be Proven With Specific, District-Level Evidence

  • Plaintiffs must demonstrate that race—not politics—predominated.
  • Correlation between race and partisanship is not enough.
  • Evidence must address each district individually.

3. Legislatures Receive a Strong Presumption of Good Faith

Abbott v. Perez reaffirmed that courts should not infer intent from

  • prior discrimination,
  • suspicious timing,
  • or even foreseeable racial effects.

4. Section 2 Remedies Require Cohesive Minority Voting Blocs

LULAC (2006) found a violation only because evidence clearly showed cohesive Latino voters whose electoral progress was intentionally undermined.

5. Courts Avoid Intruding into “Political Questions”

The Court has repeatedly signaled reluctance to take over the political process.
This culminated in Rucho v. Common Cause (2019), where the Court held partisan gerrymandering claims categorically non-justiciable—a rule entirely consistent with how Texas cases were decided.


Conclusion: Why Texas Keeps Winning

Texas’s redistricting cases illustrate how the Supreme Court draws a sharp—and highly consequential—line:

  • Racial discrimination is unconstitutional, but must be proven with very specific evidence.
  • Partisan manipulation, even extreme manipulation, is permissible.
  • Courts defer heavily to state legislatures unless plaintiffs can clearly show that lawmakers used race as a tool, not merely politics.

In LULAC, challengers succeeded only where the evidence of racial vote dilution was unmistakable.
In Abbott v. Perez, they failed everywhere except one district because intent was not proven with the level of granularity the Court demanded.

The result is that Texas has repeatedly prevailed in redistricting litigation—not necessarily because its maps are racially neutral, but because the Court has set an unusually high bar for proving racial motive and has washed its hands of partisan claims altogether.

What Every Student Should Learn From Civics and Government — The Education of a Citizen

A collaboration between Lewis McLain & AI (4 of 4 in a Series)

If literature teaches us how to think,
and history teaches us where we came from,
and economics teaches us how choices shape the world,

then civics and government teach us how to live together in a free society.

When I was young, civics felt like a recitation of facts — three branches, the Constitution, the Bill of Rights. But I didn’t understand the deeper purpose or the tremendous responsibility that citizenship carries. I didn’t see that democracy is not self-sustaining. It requires informed people, disciplined judgment, and a shared understanding of how government actually works.

Years later, I came to realize that civics is not a list of facts to memorize — it is the operating manual for freedom.

This essay explores the essential civic knowledge students should learn, why it matters, and why it may be the single most endangered — and most important — subject today.


1. Understanding the Constitution — The Blueprint of American Government

Every student should know what the Constitution actually does.

At a minimum, students should understand:

  • Separation of powers
  • Checks and balances
  • Federalism (power divided between federal and state governments)
  • Individual rights
  • Limited government
  • Due process and equal protection

These aren’t abstract ideas. They’re the safeguards that prevent:

  • tyranny
  • abuse of power
  • unequal treatment
  • political retaliation
  • the erosion of liberty

Students should know why the Founders feared concentrated power. They should understand the debates between Hamilton and Jefferson, the compromises that made the system possible, and the principles that still hold it together.

A civically educated student knows what the government can do, what it cannot do, and what it should never be allowed to do.


2. How Laws Are Made — And Why It’s Supposed to Be Hard

A free people should know how laws move from idea to reality:

  • committee
  • debate
  • amendments
  • compromise
  • bicameral approval
  • executive signature
  • judicial review

Students should understand why the system has friction. The Founders designed lawmaking to be deliberate, slow, and thoughtful — not impulsive. This protects the nation from sudden swings of emotion, political fads, or the passions of the moment.

When students understand the process, they also understand:

  • why gridlock happens
  • why compromise is necessary
  • why no single branch can act alone
  • why courts exist as an independent check

This is how civics grounds expectations and tempers frustration.


3. Rights and Responsibilities — The Moral Core of Citizenship

Civics is not only about rights; it is also about responsibilities.

Students should understand:

  • free speech
  • free press
  • freedom of religion
  • right to vote
  • right to assemble
  • right to due process

But they should also learn:

  • the responsibility to vote
  • the responsibility to stay informed
  • the responsibility to obey just laws
  • the responsibility to serve on juries
  • the responsibility to hold leaders accountable
  • the responsibility to treat fellow citizens with dignity

A functioning democracy depends as much on personal virtue as it does on institutional design.


4. Local Government — The Level Students Understand the Least

Ironically, the level of government that affects daily life the most is the one students know the least about.

Students should understand:

  • cities, counties, school districts
  • zoning
  • local taxes
  • police and fire services
  • transportation systems
  • water and utility infrastructure
  • public debt and bond elections
  • local boards and commissions
  • how a city manager system works
  • how budgets are created and balanced

Local government is where the real work happens:

  • roads repaired
  • streets policed
  • water delivered
  • development approved
  • transit planned
  • emergency services coordinated
  • property taxes assessed

A civically educated adult understands where decisions are made — and how to influence them.


5. How Elections Work — Beyond the Headlines and Sound Bites

Every student should understand:

  • how voter registration works
  • how primaries differ from general elections
  • how the Electoral College works
  • how districts are drawn
  • what gerrymandering is
  • how campaign finance operates
  • the difference between federal, state, and local elections

They should learn how to evaluate:

  • candidates
  • platforms
  • ballot propositions
  • constitutional amendments
  • city bond proposals
  • school board decisions

Without civic education, elections become personality contests instead of informed deliberations.


6. The Balance Between Freedom and Order

Civics teaches students that government constantly manages tensions:

  • liberty vs. security
  • freedom vs. responsibility
  • majority rule vs. minority rights
  • government power vs. individual autonomy

These are not easy questions.
There are no perfect answers.
But a well-educated citizen understands the tradeoffs.

For example:

  • How far should free speech extend?
  • What powers should police have?
  • When should the state intervene in personal choices?
  • When does regulation protect people, and when does it stifle them?

Civics teaches students how to think through these issues, not what to believe.


7. Why Civics Matters Even More in the Age of AI

Artificial intelligence has changed the public square. It has amplified the need for civic understanding.

AI magnifies misinformation.

A civically uneducated population is easy to manipulate.

AI can imitate authority.

Only an informed citizen knows how to verify sources and test claims.

AI accelerates public emotion.

Civic education slows people down — it teaches them to evaluate before reacting.

AI makes propaganda more sophisticated.

Civics teaches how institutions work, which protects against deception.

Democracy cannot survive without an educated citizenry.

AI is powerful, but it is not responsible. Humans must be.

This is why civics — real civics — is urgently needed.


Conclusion: The Education of a Self-Governing People

History shows that democracies do not fall because enemies defeat them.
They fall because citizens forget how to govern themselves.

Civics teaches:

  • how power is structured
  • how laws are made
  • how rights are protected
  • how communities are built
  • how leaders should be chosen
  • how governments should behave
  • how citizens must participate

If literature strengthens the mind,
and history strengthens judgment,
and economics strengthens decision-making,

then civics strengthens the nation itself.

A free society is not sustained by wishes or by luck.
It is sustained by people who understand the system, value the responsibilities of citizenship, and guard the principles that keep liberty alive.

That is what civics is meant to teach —
and why it must remain at the heart of a complete education.

What Every Student Should Learn From Economics — The Missing Foundation for Adult Life

A collaboration between Lewis McLain & AI (3 of 4 in a Series)

If I struggled with literature when I was young, and if I misunderstood the purpose of history, then economics was the third great gap in my early education. I went through high school without any real understanding of how money works, how governments raise and spend it, how markets respond to incentives, or how personal financial decisions compound over time. I did not grasp the forces shaping wages, prices, interest rates, trade, taxation, inflation, or debt. I did get a good dose in college.

Looking back, I can see clearly:
Economics is the core life subject that students most need — and most rarely receive in a meaningful way.

What educators should want every student to know from required economics courses is nothing less than the mental framework necessary to navigate adulthood, evaluate public policy, make financial decisions, and understand why nations prosper or struggle. Economics is not simply business; it is the study of how people, families, governments, and societies make choices. A few years ago, I attended a multi-day course for high school teachers hosted by the Dallas Federal Reserve. It was an outstanding experience. Resources are there today, thank goodness!

This essay explores the essential economic understanding every student deserves — and why it matters now more than ever.


1. Scarcity, Choice, and Opportunity Cost: The Law That Governs Everything

The first truth of economics is painfully simple:
We cannot have everything we want.

Every choice is a tradeoff. Students should walk away understanding that:

  • Choosing to spend money here means not spending it there.
  • Choosing one policy means giving up another.
  • Choosing time for one activity means sacrificing time for something else.

Economics calls this opportunity cost — the value of the next best alternative you give up.

Once a student grasps this, the world becomes clearer:

  • Why governments cannot fund unlimited programs.
  • Why cities must prioritize.
  • Why individuals must budget.
  • Why nations cannot tax, borrow, or spend without consequences.

This one idea alone can save people from poor decisions, unrealistic expectations, and political manipulation.


2. How Markets Work — And What Happens When They Don’t

Every student should understand the basics of markets:

  • Supply and demand
  • Prices as signals
  • Competition as a force for innovation
  • Incentives as drivers of behavior

These are not theories — they are observable realities.

Examples:

  • When the price of lumber rises, construction slows.
  • When wages rise in one industry, workers shift into it.
  • When a product becomes scarce, people value it more.

Students should also learn about market failures, when markets do not work well:

  • Externalities (pollution)
  • Monopolies (lack of competition)
  • Public goods (national defense)
  • Information asymmetry (the mechanic knows more than the customer)

A well-educated adult should understand why some things are best left to markets, and others require collective action.


3. Money, Inflation, and the Hidden Forces That Shape Daily Life

Economics teaches students what money actually is — a medium of exchange, a store of value, a unit of account. It teaches why inflation happens, how interest rates work, and why credit matters.

This is the knowledge people most need to avoid lifelong mistakes:

  • High-interest debt
  • Payday loans
  • Adjustable-rate surprises
  • Over-borrowing
  • Misunderstanding mortgages
  • Under-saving for retirement
  • Falling for financial scams

Inflation, especially, is a quiet teacher.
Students should know:

  • Why prices rise
  • How purchasing power erodes
  • Why governments sometimes overspend
  • How central banks attempt to stabilize the economy

Without this understanding, adults become vulnerable to false promises, political slogans, and emotional decisions disguised as economic policy.


4. Government, Taxes, Debt, and the Economics of Public Choices

Students should understand how governments fund themselves:

  • income taxes
  • sales taxes
  • property taxes
  • corporate taxes
  • tariffs
  • fees and permits

They should know the difference between:

  • deficits and debt
  • mandatory vs. discretionary spending
  • expansionary vs. contractionary policy

And they should understand the consequences of borrowing:

  • interest costs
  • crowding out
  • inflationary risks
  • intergenerational burdens

A citizen who understands these concepts is harder to fool with slogans like:

  • “Free college for everyone!”
  • “We can tax the rich for everything!”
  • “Deficits don’t matter!”
  • “We can cut taxes without cutting services!”

Economics teaches that every promise has a cost — and someone must pay it.


5. Personal Finance: The Economics of Everyday Life

If there is one area where economics should be utterly practical, it is here.
Every student needs to understand:

  • budgeting
  • saving
  • compound interest
  • emergency funds
  • insurance
  • investing basics
  • retirement accounts
  • debt management
  • risk vs. reward

Without this, students walk into adulthood with no map — and they learn lessons the hard way.

One simple example:
$200 saved per month from age 22 to 65 at 7% grows to roughly $600,000 (compounded annually).
The same $200 saved starting at age 35 grows to only about $230,000.

Time matters.
Compounding matters.
Knowing this early changes lives.
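
A minimal sketch of the arithmetic behind these figures, assuming $2,400 per year ($200/month) compounded annually at 7%; monthly compounding produces somewhat higher totals, but the gap between starting early and starting late is the point.

```python
# Future value of steady saving: the compounding arithmetic behind the
# example above. Assumes annual contributions and annual compounding.

def future_value(annual_saving, rate, years):
    """Future value of a level annual contribution (ordinary annuity)."""
    return annual_saving * ((1 + rate) ** years - 1) / rate

early = future_value(2_400, 0.07, 65 - 22)   # start saving at 22
late  = future_value(2_400, 0.07, 65 - 35)   # start saving at 35
print(f"Start at 22: ${early:,.0f}")         # roughly $595,000
print(f"Start at 35: ${late:,.0f}")          # roughly $227,000
```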


6. Global Economics: Trade, Jobs, and National Strength

Students should understand why countries trade:

  • comparative advantage
  • specialization
  • global supply chains
  • exchange rates

They should understand what drives:

  • tariffs
  • sanctions
  • trade deficits
  • manufacturing shifts
  • labor markets

This is the foundation for understanding why:

  • some industries move overseas
  • some cities decline while others rise
  • automation replaces certain jobs
  • immigration affects labor supply
  • global shocks (like pandemics or wars) reshape economies

A student with global economic literacy is less fearful and more informed — and can better adapt to economic change.


7. Economics and Human Behavior

Economics is not just numbers — it is a window into human nature.

Students should learn:

  • why incentives matter
  • why people respond predictably to policy changes
  • why scarcity shapes decisions
  • why risk and reward are universal
  • why unintended consequences are common

For example:

  • Overly generous unemployment benefits can reduce the incentive to return to work.
  • Rent control can reduce housing supply, raising prices long-term.
  • Strict zoning can artificially inflate housing costs.
  • Tax breaks can shift business decisions but may not produce promised jobs.

Economics helps students see beyond intentions to outcomes.


8. Why Economics Matters Even More in the Age of AI

AI has changed everything — except human nature and economic reality.

AI can process data, but it cannot interpret incentives.

Only a human mind can understand why people behave as they do.

AI can forecast trends, but it cannot grasp consequences.

Consequences require judgment shaped by real-world understanding.

AI can make decisions quickly, but it cannot weigh tradeoffs ethically.

Economics teaches students how those tradeoffs work.

AI makes bad decisions faster when guided by people who don’t understand economics.

A poorly trained human with a powerful tool is dangerous.
A well-trained human with the same tool is wise.

Economics is the steadying force that helps society use AI responsibly.


Conclusion: The Blueprint for a Competent Adult

What educators want students to gain from economics is not technical jargon or narrow theories. It is an understanding of how the world works.

Economics teaches:

  • how choices shape outcomes
  • how incentives drive behavior
  • how money, markets, and governments interact
  • why prosperity is fragile and must be understood
  • how individuals, families, and nations manage limited resources
  • how to avoid financial mistakes and public illusions

If literature strengthens the mind and imagination,
and history strengthens judgment and citizenship,
economics strengthens decision-making — the backbone of adult life.

Together, they form the education every young person deserves before entering the real world. And the most important thing I hope you take away from this essay and my experience: college in general and high school in particular are where you launch into a lifetime of learning (and re-learning). Anything you see in this series that you judge you missed, go back and learn! LFM

Mass Shootings in America

A collaboration between Lewis McLain & AI

Hard Lessons, Real Stories, and the Ground-Level Solutions Law Enforcement Says Actually Work

Mass shootings in America have become a recurring national nightmare: predictable yet unpredictable, familiar yet devastating, common yet individually shattering. The politics surrounding them often emphasize blame, ideology, or emotion. What receives far less attention is the actual investigative DNA of these attacks — the timelines, the warnings, the coordination failures, and the moments when someone did intervene and stopped a massacre before it began.

To understand what truly works, we must look at the cases, not the slogans. The lives lost — and the lives saved — tell us more than any press conference or political tweet.

This essay explores the problem the way police, detectives, and federal threat-assessment specialists see it: case by case, pattern by pattern, weakness by weakness, and success by success.


I. What Mass Shootings Look Like Through Law Enforcement Eyes

Ask any detective with experience in threat assessment, and they will tell you a truth that ordinary Americans rarely hear:

“We almost always know who’s spiraling long before the shooting happens.
The problem is — nobody acts fast enough, firmly enough, or in sync.”

The datasets from the FBI, Secret Service, ATF, and state fusion centers show several common threads:

  • Shooters leak intent.
  • They study previous attacks.
  • They experience years of decline — socially, mentally, financially, emotionally.
  • They accumulate grievances.
  • Someone always notices something.

Law enforcement doesn’t describe these as “senseless crimes.”
They describe them as interceptable crises.


II. Real Cases That Reveal How Systems Fail — and Could Have Succeeded

These examples are not chosen to support any ideology.
They are simply the clearest windows into reality.


1. SUTHERLAND SPRINGS, TX (2017)

A tragedy by bureaucracy — 26 killed, 22 injured

  • Shooter convicted of domestic violence in the Air Force
  • Legally prohibited from firearm ownership
  • Air Force never uploaded the conviction into NICS
  • He passed background checks he should have failed

A church full of families was devastated because a clerk in a military office did not submit a form.

Law enforcement conclusion:
“Fix the reporting system and this shooter never gets a gun.”


2. UVALDE, TX (2022)

Dozens of warnings — none acted on in time

  • Multiple students reported terrifying social media posts
  • The shooter had photos of weapons, threats, violent messages
  • Friends said he was “spiraling”
  • A near-complete mental health collapse went unaddressed

The tragedy in Uvalde was compounded by a catastrophic police response — but the earlier failures are equally important: warning signs ignored, red flags dismissed, no early intervention team engaged.

Law enforcement conclusion:
“If someone had been empowered to intervene early, this kid never reaches that school door.”


3. MIDLAND–ODESSA, TX (2019)

He failed a background check — then bought a weapon privately

  • Shooter tried to buy a gun from a licensed dealer
  • He FAILED the background check
  • He then purchased a rifle through a private sale with no check
  • He spiraled, snapped during a traffic stop, and killed 7 people

Texas DPS and FBI called this case the “perfect storm of loopholes.”

Law enforcement conclusion:
“A failed background check should trigger a welfare follow-up.
Nobody checked on him.”


4. FORT HOOD, TX (2009)

A shooter telegraphed his radicalization — nothing done

  • Major Nidal Hasan repeatedly communicated extremist ideology
  • Colleagues reported him
  • Concerns were dismissed to avoid accusations of bias

This case shows what law enforcement calls “hesitation risk” — institutions afraid to act decisively.


5. LAS VEGAS, NV (2017)

The outlier — almost no warning signs

This shooter is the exception that proves the rule.
Law enforcement found:

  • no threats,
  • no manifesto,
  • no social media trail,
  • no extremist network.

He was wealthy, isolated, and meticulous.

Conclusion:
A tiny percentage of cases will bypass all prevention systems.
Most will not.


III. The Cases Where Mass Shootings Were Prevented — Proof That Prevention Works

These are not theories.
These are real, documented saves.


1. Richmond, VA (2022) — A July 4th massacre stopped cold

A man overheard a conversation about an attack planned on a holiday celebration.
He reported it.
Police uncovered weapons, plans, and a manifesto.

Lives saved: potentially hundreds.


2. Lubbock, TX (2021) — A 13-year-old stopped before carrying out school attack

The student had:

  • a detailed map
  • a written kill list
  • weapons ready
  • a manifesto

His grandmother found the notebook and reported him immediately.

Law enforcement conclusion:
“Family vigilance prevented mass casualties.”


3. Daytona Beach, FL (2019) — Threat assessment works

A student posted online:
“I’m going to shoot up the school.”

A classmate reported it.
Within hours:

  • police arrived
  • family cooperated
  • weapons were secured
  • the student received a psychiatric evaluation

A textbook intervention.


4. Washington State (2015) — School attack prevented by a friend’s courage

A 15-year-old planned a Columbine-style attack.
He shared part of his plan with a friend.
The friend reported it, despite fear of social backlash.

Police discovered:

  • an AK-47
  • detailed plans
  • written threats

Friendship and courage saved a school.


5. Plano, TX Workplace Attack Prevented (2016)

A disgruntled employee expressed violent intent toward coworkers.
HR flagged it.
The company called police.
He was interviewed, weapons removed, and evaluated.

No attack occurred.


IV. What Law Enforcement Says Actually Works (Not Ideology — Evidence)

After decades of analysis, police agencies, FBI profilers, Secret Service behavioral specialists, and state threat-assessment units consistently identify five high-impact, realistic solutions.

Not bans.
Not fantasies.
Not slogans.

Real solutions grounded in actual casework.


1. Fix the Data — The Fastest Way to Save Lives

Cases like Sutherland Springs and Midland–Odessa show the role of:

  • missing convictions
  • unfiled restraining orders
  • unreported mental-health rulings
  • incorrect identifiers

Law enforcement calls this:

“The invisible failure that kills.”

The fix:
mandatory reporting audits and penalties for noncompliance.


2. County-Wide Threat Assessment Teams (The Best Tool We Have)

Teams combining:

  • sheriff’s office
  • schools
  • mental health
  • prosecutors
  • social workers

These teams already exist in:

  • Virginia (after Virginia Tech)
  • Florida (after Parkland)
  • Utah (statewide)
  • North Texas school districts

And they work.

They have stopped dozens of planned attacks by:

  • interviewing individuals
  • securing weapons temporarily
  • offering services
  • coordinating follow-up
  • de-escalating crises

This is the single most successful prevention method America has.


3. Mandatory Follow-Up on Credible Threat Reports

This is not punitive.
It is welfare-based intervention, used worldwide.

Every credible threat triggers:

  • a home visit
  • mental-health assessment
  • background check review
  • firearm-safety conversation (or temporary transfer if warranted)
  • follow-up plan

This would have intervened in:

  • Parkland
  • Uvalde
  • Santa Fe
  • Highland Park
  • El Paso
  • Dayton

Law enforcement overwhelmingly supports this.


4. Hardening Soft Targets — Without Militarizing Them

Realistic, non-intrusive upgrades:

  • shatter-resistant glass
  • classroom doors that lock from inside
  • unified communications (so responders hear the same thing)
  • interior safe zones
  • trained voluntary armed staff (Texas Guardian Program)
  • real-time law enforcement access to building layouts
  • festival/event perimeter redesigns

These upgrades prevented casualties in:

  • West Freeway Church of Christ, White Settlement, TX (armed volunteer stopped shooter in seconds, 2019)
  • Arvada, CO store attack (2021)
  • multiple school attacks where locked classrooms saved children

5. Breaking Adult Isolation — The Hidden Variable

Law enforcement notes a growing pattern: older, isolated, grievance-driven adults.

Examples:

  • Half Moon Bay (2023)
  • Buffalo supermarket shooter lived in complete isolation for years
  • Dayton shooter with obsessive ideation
  • Midland–Odessa shooter living alone in a squalid shack

Effective interventions:

  • workplace threat reporting
  • veteran wellness checks
  • aging men’s mental health programs
  • community navigator teams
  • training employers to recognize decompensation

These are low-cost and high-impact.


V. The Most Underreported Factor: Courage of Bystanders

Again and again, the preventions happened because someone —

  • a coworker
  • a teacher
  • a classmate
  • a grandmother
  • a friend
  • a roommate

chose to speak up.

Law enforcement calls this:

“The single most important variable in preventing mass violence.”

Bystanders save more lives than laws.


VI. The Moral Imperative: Replace Hopelessness With Method

Mass shootings aren’t random.
They aren’t unpredictable.
And they aren’t unsolvable.

What we need isn’t a perfect solution — it’s a functional system.

  • Competent reporting
  • Seamless coordination
  • Early intervention
  • Community eyes
  • Physical barriers that buy seconds
  • Adults who refuse to look away

These are the realistic, proven, workable solutions that law enforcement supports because they have watched them succeed in the field.


Conclusion: A Country That Can Change — If It Wants To

America doesn’t have to choose between freedom and safety.
It must choose between chaos and coordination.

The truth is painful but hopeful:

Most mass shootings are preventable.
Not with bans.
Not with magic.
But with systems that work and communities that care.

This is not a political argument.
It is a practical one — written in blood and proven by the cases where tragedy was avoided.

The question now is whether the country is willing to move beyond slogans and toward the solutions that actually save lives.


APPENDIX

Texas Mass Violence Prevention Framework (2025 Edition)
A State-Specific Policy, Law-Enforcement, and Case-Based Reference


I. Texas Case Studies (Successes and Failures)

Texas provides a uniquely large dataset for examining mass shootings: rural, suburban, urban, along the border, in oilfield regions, in major metros. These cases reveal consistent system gaps.


A. When the System Failed

1. Sutherland Springs (2017) — Data Failure

  • Domestic violence conviction not reported by the Air Force
  • Shooter passed background checks he should have failed
  • 26 dead, 22 wounded

Gap identified: Failure to report disqualifying convictions to NICS.
Texas impact: Dozens of counties still fail to upload mental-health adjudications consistently.


2. Santa Fe High School (2018) — No Warning System

  • 10 killed, 13 injured
  • Shooter had written violent fantasies, wore trench coat daily, showed disturbing art
  • None of it triggered intervention under existing school policies

Gap identified: Lack of integrated school threat-assessment teams pre-Parkland-style reforms.


3. El Paso Walmart Attack (2019) — Ideology, Isolation, and Online Radicalization

  • Shooter posted manifesto 20 minutes before attack
  • Family saw increasing withdrawal but did not see a way to intervene legally
  • 23 killed, 22 injured

Gap identified: No statewide reporting mechanism for family concern + lack of early intervention infrastructure.


4. Midland–Odessa (2019) — Failed Check + No Follow-Up

  • Shooter failed a background check
  • Still obtained rifle via private sale
  • Escaped all follow-up and monitoring
  • 7 killed, 25 injured

Gap identified: Texas has no “background check failure follow-up” protocol for welfare checks.


5. Uvalde (2022) — Warnings but No Coordinated Response

  • 30+ warning signs in digital posts
  • Peers alarmed
  • Threat assessment not mobilized
  • Failed command, failed entry, failed radios, failed leadership

Gaps identified:

  • early intervention
  • communication systems
  • unified command
  • school hardening
  • law-enforcement coordination

B. When the System Worked (Successful Texas Preventions)

1. Lubbock (2021) — Grandmother Stops School Attack

  • 13-year-old with kill list, weapons, and plans
  • Grandmother reported him immediately
  • Police confiscated weapons, intervened, managed mental-health services

Success factor: Courageous family reporting + rapid police response + cooperative mental health team.


2. Plano Workplace Threat (2016)

  • Employee threatened violence after disciplinary action
  • HR flagged it
  • Plano PD intervened
  • Shooter’s plan was disrupted without arrest

Success factor: Employer training + HR protocols + law enforcement follow-through.


3. White Settlement Church (2019)

  • Shooter killed two people during service
  • Armed volunteer neutralized the shooter within 6 seconds
  • Attack ended before a second reload

Success factor: Legally authorized armed volunteer program (“Guardian”-style model) + training + mental readiness.


4. North Texas High School Plots Disrupted (Multiple 2020–2024)

School districts in Denton, Collin, and Tarrant Counties thwarted more than a dozen serious plots because of:

  • school resource officers
  • student tips
  • routine digital threat monitoring
  • counseling interventions
  • multi-party threat assessment teams

Success factor: Post-Parkland statewide reforms requiring threat assessment teams in ISDs.


II. Texas Law Enforcement Consensus (Interviews, Briefings & Reports)

Across:

  • Texas Police Chiefs Association
  • County Sheriffs
  • DPS briefings
  • Texas School Safety Center
  • Fusion centers
  • Large-city PDs (Houston, Dallas, San Antonio, Austin, Fort Worth)

The consistent message is this:

“Almost every mass shooting is preventable if someone can act early —
but the system doesn’t empower people to act.”

Their concerns fall into five categories:


1. Lack of Consequences for Non-Reporting

Agencies that fail to upload disqualifying records face no meaningful penalties.
Sheriffs say:

“If reporting is optional, tragedy is inevitable.”


2. Fragmented Threat Assessment

Texas has strong school-based threat-assessment systems, but adult threat assessment is weak.

DPS Colonel Steven McCraw has repeatedly said:

“Adult shooters fall completely outside school safety structures.”


3. Soft Targets and Weak Facilities

Sheriffs in rural counties often point out:

“Our churches, fairs, festivals, and schools were built before the era of mass violence.”

Meaning: physical layouts are outdated.


4. Too Many Lone, Isolated, Angry Adults

Texas PDs say they increasingly deal with:

  • divorced, isolated adult men
  • untreated mental illness
  • workplace grievances
  • housing-insecure individuals
  • online radicalization across the spectrum

This is the modern offender profile — not simply youth shooters.


5. No Statewide Mechanism for “Background Check Failures”

Law enforcement consistently recommends:

“If someone fails a background check, they should receive a welfare check.
Not to seize weapons — but to understand the risk.”

This one reform alone could have prevented Midland–Odessa.


III. Concrete State-Level Solutions (Non-Ideological and Realistic)

These are politically feasible, budget-achievable, and supported by law enforcement.


1. Mandatory Reporting Compliance Audits

Texas should audit:

  • county clerks
  • JP courts
  • district courts
  • mental-health orders
  • protective orders

Goal: ensure all disqualifying records (convictions, mental-health adjudications, protective orders) enter NICS/DPS within 24–72 hours.

Cost: low
Impact: high


2. “Texas Adult Threat Assessment Teams” (T-ATAT)

Modeled after school threat teams but focused on adults.

Teams would include:

  • Sheriff’s office
  • Constables
  • Mental health mobile crisis units
  • Prosecutors
  • Social workers
  • Veteran services
  • Employers (optional)

Focus:

  • early intervention
  • de-escalation
  • temporary safety plans
  • coordinated follow-up

This addresses the adult half of the Texas shooter profile: isolated adult men.


3. Background Check Failure Protocol (Welfare Check + Mental Health Screen)

If a Texan:

  • fails a background check
  • attempts an illegal straw purchase
  • makes “alarmingly specific” threats

…then DPS notifies the sheriff in that county.

Sheriff conducts:

  • welfare check
  • mental-health referral (if needed)
  • firearm safety conversation
  • case documentation

No confiscation required.
No criminal charge required.

Simply breaking the isolation saves lives.


4. Realistic Target Hardening for Schools, Churches & Events

Low-cost priorities:

  • shatter-resistant entry glass
  • interior locking mechanisms
  • campus-wide communication systems
  • unified law enforcement radio channels
  • updated maps accessible digitally to responders
  • controlled-access vestibules
  • volunteer security programs

These measures have already saved lives at:

  • White Settlement church
  • West Texas schools where locked classrooms stopped entry
  • multiple thwarted school plots


5. Community Navigator Teams for Isolated Adults

Texas sheriffs strongly endorse pilot programs in:

  • rural counties
  • oilfield regions
  • border colonias
  • veteran-dense areas

Navigators perform:

  • wellness checks
  • reconnection to family, church, and social services
  • employment referrals
  • mental-health connections
  • regular follow-up

This is cheap and effective.


6. Employer Training Statewide (especially in high-stress industries)

Texas mass violence often emerges from:

  • trucking
  • energy sector
  • distribution warehouses
  • food processing plants
  • call centers

Employers need:

  • threat-recognition training
  • HR escalation pathways
  • connections to sheriff’s offices

This is what prevented violence in the Plano case.


IV. “What Good Intervention Looks Like” — Texas Examples

Case A: North Texas High School Plot Stopped (2023)

  • Student posted detailed shooting threat
  • Classmates reported immediately
  • Threat team met same day
  • Parents cooperated
  • Police conducted home visit
  • Weapons removed temporarily
  • Student entered crisis counseling
  • No criminal record created

Outcome:
No violence.
Family relieved.
School safe.
Child received long-term care.


Case B: Rural West Texas Veteran (2020)

  • Veteran in crisis making alarming comments
  • Neighbor reported
  • Sheriff’s deputy and veteran liaison responded
  • Weapons temporarily transferred to brother
  • Veteran placed in VA crisis stabilization program
  • Follow-up by navigator team

Outcome:
Incident avoided.
Veteran stabilized.
No arrests.
Family grateful.


Case C: Dallas-Area Workplace (2022)

  • Worker said he wanted to “take out” supervisors
  • HR staff were trained under the Texas Workplace Safety Pilot Program
  • HR called police
  • PD interviewed, implemented voluntary safety plan
  • Mental health assistance provided
  • Employer changed his job assignment

Outcome:
No violence.
Employee recovered, remained employed.


V. Statewide Recommended Implementation Plan

Year 1 (Fast Wins)

  • NICS reporting audits
  • Texas Adult Threat Assessment Teams (pilot in 8 major counties)
  • DFPS and mental-health reporting refreshers
  • Standardized threat reporting hotline

Year 2 (Scalable Programs)

  • statewide employer training
  • community navigator expansion
  • school physical-security retrofits
  • integrated law enforcement communications

Year 3 (Long-Term Infrastructure)

  • full digital courthouse-to-DPS records transmission
  • unified statewide threat-assessment database
  • mental-health telecrisis network across rural counties

VI. “Texas Principles” for Mass Violence Prevention

Law enforcement leaders often summarize what works into three Texas-style principles:

1. “If it’s predictable, it’s preventable.”

Almost every attacker reveals intent.

2. “You can’t fix what you don’t see.”

Isolation breeds violence — intervention disrupts it.

3. “Don’t wait for perfect. Act when something seems wrong.”

Prevention happens early or not at all.


VII. Conclusion of Appendix

Texas is poised to lead the nation with non-ideological, realistic, enforceable policies that:

  • honor the Second Amendment
  • respect local control
  • prioritize law enforcement input
  • rely on early intervention, not confiscation
  • strengthen communities, not weaken them
  • save lives without dividing the country

Mass violence is not an unsolved mystery.
It is a coordination problem, a communication problem, and at its core, a human connection problem.

Texas can fix these.
Texas has the tools.
Texas has the cases.
And now, Texas has the blueprint.



APPENDIX B

“What I’ve Learned After 20 Years Responding to Mass Violence”
A Law Enforcement Perspective

I’ve worn a badge in Texas for more than two decades. I’ve seen quiet towns shaken by unspeakable violence, and I’ve seen ordinary citizens step up to prevent tragedies the public will never hear about. I’ve walked through crime scenes that will stay with me until the day I retire, and I’ve sat at kitchen tables with parents who have no words left except, “Why?”

After all this time, I’ve learned that nearly everything the public argues about is only a sliver of the truth. Mass violence doesn’t happen because one law wasn’t passed or because one political side is right and the other is wrong. It happens because systems fail, people look away, warnings go unreported, and institutions are afraid to act when someone is spiraling.

This is what it looks like from where I stand.


I. “We Almost Always Know”

The hardest truth is this:

In most cases, the shooter was on someone’s radar long before they opened fire.

I’m not talking about clairvoyance.
I’m talking about patterns.

In case after case, we’ve seen:

  • threats posted online
  • violent fantasies shared with friends
  • domestic disturbances
  • histories of grievance and obsession
  • escalating isolation
  • coworker concerns
  • school warnings
  • welfare checks that never happened
  • mental health breaks that went untreated

We call these “pre-incident indicators.”
They’re real. They’re measurable. And they’re almost always present.

The tragedy is not that we don’t know —
it’s that we don’t act fast enough or in sync enough.


II. “It’s Not the Gun — It’s the Spiral”

I’ve taken more guns off the street than I can remember. Hunting rifles. Handguns. A few illegally modified weapons. And yes, rifles with large magazines.

But here’s the truth you learn after 20 years:

It’s never the gun in isolation.
It’s the downward slide no one interrupts.

Shooters rarely just “snap.”
They are individuals who decline over months or years.

We see:

  • isolation
  • job loss
  • family collapse
  • grievance accumulation
  • untreated depression
  • anger fixation
  • obsession with previous shooters
  • social withdrawal
  • personality change

By the time they act violently, they’ve been at the bottom of a well for a long time—and no one lowered a rope.

If you want to know what law enforcement believes will make the biggest difference, it’s this:

Catch the spiral before the crash.


III. “Families Know First”

I wish the public understood how many times a parent, sibling, spouse, or grandparent has quietly whispered to me:

“I’m scared of what he might do.”
“He’s not the same person anymore.”
“He talks about violence.”

But they didn’t know what to do.
They didn’t want their family member arrested.
They didn’t want to “ruin his life.”
They didn’t know if it was serious.
Sometimes they were embarrassed.

Here’s what I want every Texan to know:

Calling us doesn’t automatically mean a criminal charge.
Most of the time, early intervention means:

  • mental-health evaluation
  • voluntary firearm transfer
  • crisis services
  • counseling
  • follow-ups
  • family coordination

The public imagines a SWAT raid.
What usually happens is a conversation at the kitchen table.


IV. “Threat Assessment Teams Work — Better Than Anything Else We’ve Tried”

The best tool we have isn’t complicated:

Get the right people around the same table before someone gets hurt.

A threat assessment team — the way we run them in parts of Texas — includes:

  • detectives
  • school representatives
  • mental-health clinicians
  • prosecutors
  • social service partners
  • sometimes clergy or veterans’ liaisons

When these teams function, they catch things that no single agency would ever catch alone.

I’ve seen teams:

  • talk a teenager out of a violent plan
  • get an unstable adult into treatment
  • mediate workplace grievances
  • defuse domestic crises
  • remove firearms voluntarily
  • help families reconnect
  • stop ideologically motivated plots

And the public never knows because nothing bad happened.

I can tell you without hesitation:

Threat assessment has prevented more mass shootings than any law ever passed.


V. “Follow-Up Saves Lives”

One of the biggest failures in this country is the belief that if someone doesn’t break the law, there’s nothing we can do.

That’s false.

We can:

  • check on them
  • talk to them
  • bring mental-health professionals
  • involve the family
  • secure weapons voluntarily
  • create a safety plan
  • follow up again and again

The cases that haunt me are the ones where the warning signs were clear, someone called, and then the file sat on a desk — or was never shared with the people who could act.

The most effective thing we can do is simple:

If a credible threat comes in, someone must check on that person within 24 hours.

Not to arrest.
To assess.
To intervene early.


VI. “You Don’t Need to Militarize a School to Make It Safe”

I’ve been inside dozens of Texas schools.
Some built in the 1960s with glass doors that could be breached by a lawn chair.
Some built after 2018 with lockdown doors, radio repeaters, and secure vestibules.

You know what helps?

  • classroom doors that lock from the inside
  • shatter-resistant glass
  • clear communication systems
  • unified law enforcement radio channels
  • controlled access
  • trained school staff who know what to do

You know what doesn’t help?

  • finger-pointing
  • slogans
  • political theater

Small, inexpensive improvements save more lives than any sweeping overhaul.


VII. “We Need Community, Not Just Cops”

People assume mass violence is a police problem.
It isn’t.

It’s a community problem.

The most important actors in prevention are:

  • families
  • coworkers
  • HR officers
  • school counselors
  • pastors
  • friends
  • neighbors

You see the cracks before we do.
You see the shift in behavior.
You hear the disturbing comment.
You watch the decline.

And when you call us, you give us a chance to help before the damage is done.


VIII. “The Truth No One Wants to Admit”

I’ve seen evil.
I’ve seen pain.
I’ve seen things I won’t describe in a public essay.

But I’ve also seen:

  • a grandmother save a school
  • a coworker prevent a workplace massacre
  • a pastor de-escalate a veteran in crisis
  • a teacher stop a tragedy with one phone call
  • a church security volunteer act in six seconds to end a deadly attack

The truth is this:

Mass shootings are not unstoppable.
They are unaddressed.
There’s a difference.

We can fix this.
We know how.
We have the tools.
We just have to use them consistently.


IX. My Message to Texans

If you want to save lives, don’t start with Congress.
Start with:

  • local coordination
  • early intervention
  • better reporting
  • stronger families
  • human connection
  • courage when something feels wrong

Texas has already stopped attacks because the right person spoke up.
And Texas has suffered attacks because the right person stayed silent.

We can change that.


X. Final Word

I’ve carried children out of classrooms.
I’ve stepped over shell casings in churches.
I’ve held the hands of grieving parents.
I’ve watched communities heal with patience, courage, and love.

I don’t want to see another town go through this.
And we don’t have to.

Not if we act early.
Not if we act together.
Not if we see the warning signs and refuse to ignore them.

Most shootings are preventable long before a trigger is ever pulled.
Our job is to step in before someone reaches the point of no return.

And that is something Texas can lead the nation in doing — not through division, but through determination.