The Revised Smith-Mundt Act: Transparency, Accountability, and Journalistic Integrity

Purpose

This law aims to:

  1. Update the Smith-Mundt Act
  2. Stop domestic propaganda
  3. Ensure transparency and accountability in media and diplomacy
  4. Protect First Amendment rights
  5. Safeguard digital privacy
  6. Prevent unconstitutional surveillance
  7. Ensure AI safety and security

Key Provisions

  1. Ban on Domestic Propaganda

    • No government funds for influencing US citizens
    • Clear labeling of public diplomacy content
  2. First Amendment Protections

    • No censorship or suppression of protected speech
    • Warrants required for user data
    • No coercion to remove content
  3. Journalistic Standards

    • Mainstream media adheres to ethics and standards
    • Penalties for spreading propaganda
  4. Media Ownership Transparency

    • Disclosure of ownership structures
    • No hidden influences or conflicts
  5. Independent Media Support

    • Funds for independent media and investigative journalism
    • Diversity in media ownership and perspectives
  6. Digital Privacy Protections

    • No unconstitutional surveillance
    • Warrants required for digital information
  7. AI Safety and Security

    • Transparent AI development
    • No bias or tampering
    • Encryption and protection

Accountability and Oversight

  1. AI Oversight Agency (AOA)
  2. Independent Development Team (IDT)
  3. Public-Private Partnerships (PPP)
  4. Congressional oversight committees
  5. Transparency reports and whistleblower protections

Implementation

  1. Timeline: 180 days for implementation
  2. Key performance indicators (KPIs)
  3. International collaboration
  4. Impact assessment on employment and education

Conclusion

Effective implementation requires cooperation between agencies.

Smith-Mundt Transparency, Accountability, and Journalistic Integrity Act

Section 1. Short Title

This Act may be cited as the “Smith-Mundt Transparency, Accountability, and Journalistic Integrity Act.”

Section 2. Purpose

To modernize the Smith-Mundt Act, prevent domestic propaganda, ensure transparency and accountability in public diplomacy and media, safeguard First Amendment rights, protect digital privacy, prevent unconstitutional surveillance, and ensure AI safety and security.

Section 3. Definitions

(1) “Public diplomacy” means information and cultural programs to promote US interests abroad.

(2) “Domestic propaganda” means information or messaging intended to influence US citizens.

(3) “Protected speech” means speech protected under the First Amendment.

(4) “Mainstream media” (MSM) means major news outlets and media organizations.

(5) “Search engines” means online search platforms.

(6) “Apps” means mobile and web applications.

(7) “AI” means artificial intelligence systems.

(8) “Treason” means actions that undermine national security or constitutional rights.

Section 4. Prohibition on Domestic Propaganda

(a) No funds authorized by this Act shall be used to disseminate information or messaging intended to influence US citizens.

(b) All public diplomacy content shall clearly identify its source and purpose.

Section 5. First Amendment Protections

(a) The US government shall not:

(1) Censor or suppress protected speech

(2) Demand user data without a warrant

(3) Coerce platforms to remove content

(b) Social media platforms, MSM, search engines, and apps shall:

(1) Publish transparent moderation guidelines

(2) Provide clear content removal procedures

(3) Offer an appeals process for removed content

Section 6. Journalistic Standards and Accountability

(a) MSM shall adhere to established journalism ethics and standards.

(b) Penalties for spreading domestic propaganda shall be established.

Section 7. Media Ownership Transparency

(a) Media outlets shall disclose ownership structures.

(b) Hidden influences or conflicts of interest shall be prohibited.

Section 8. Independent Media Support

(a) Funds shall be allocated for independent media outlets and investigative journalism initiatives.

(b) Diversity in media ownership and perspectives shall be promoted.

Section 9. Communication Database

(a) Government entities shall maintain an official database of communications with social media, MSM, search engines, and apps.

(b) The database shall be accessible to Congressional oversight committees.
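
The Act does not specify how the Section 9 database should be structured. Purely as an illustrative sketch, and not as part of the Act, each government-to-platform contact could be stored in a simple table such as the one below; the table name, column names, and helper function are all assumptions.

```python
import sqlite3

# Illustrative schema for the Section 9 communication database.
# Names and fields are assumptions, not requirements of the Act.
SCHEMA = """
CREATE TABLE IF NOT EXISTS communications (
    id          INTEGER PRIMARY KEY AUTOINCREMENT,
    timestamp   TEXT NOT NULL,   -- ISO 8601 timestamp of the contact
    agency      TEXT NOT NULL,   -- government entity initiating the contact
    recipient   TEXT NOT NULL,   -- platform, MSM outlet, search engine, or app
    channel     TEXT NOT NULL,   -- e.g. 'email', 'meeting', 'formal request'
    subject     TEXT NOT NULL,
    summary     TEXT NOT NULL,
    records_url TEXT             -- pointer to archived correspondence, if any
);
"""

def log_communication(db_path: str, record: dict) -> None:
    """Append one communication record; `record` must supply every column except id."""
    with sqlite3.connect(db_path) as conn:
        conn.executescript(SCHEMA)
        conn.execute(
            "INSERT INTO communications "
            "(timestamp, agency, recipient, channel, subject, summary, records_url) "
            "VALUES (:timestamp, :agency, :recipient, :channel, :subject, :summary, :records_url)",
            record,
        )
```

Congressional access under subsection (b) could then be provided as read-only queries over this table.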

Section 10. Search Engine Neutrality

(a) Search engines shall not:

(1) Restrict or influence search results

(2) Prioritize content based on government influence

(b) Search engines shall:

(1) Implement transparent algorithms

(2) Disclose content moderation policies

Section 11. Digital Privacy Protections

(a) Government entities shall not engage in unconstitutional surveillance.

(b) Digital information collection requires court-ordered warrants.

(c) Probable cause shall be required for digital surveillance.

(d) Oversight committees shall have access to collected digital information.

Section 12. Repeal of Patriot Act Provisions

(a) Sections 206 and 215 of the USA PATRIOT Act and Section 6001 of the Intelligence Reform and Terrorism Prevention Act of 2004 (the "lone wolf" provision) are repealed.

(b) Unconstitutional surveillance provisions are nullified.

Section 13. AI Safety and Security

(a) AI systems shall be designed with integrity and transparency.

(b) AI systems shall not be programmed with bias.

(c) AI systems shall employ strong encryption.

(d) AI systems shall be protected from influence and tampering.

Section 14. AI Oversight

(a) Three separate and compartmentalized AI entities shall be established:

  • AI Review Entity (ARE)

  • AI Oversight Entity (AOE)

  • AI Operations Entity (AOO)

(b) AI Review Entity (ARE):

(1) Conducts regular audits of AI systems

(2) Identifies and reports potential security threats and vulnerabilities

(c) AI Oversight Entity (AOE):

(1) Monitors AI systems for tampering or unauthorized access

(2) Ensures AI systems remain uncompromised

(3) Detects potential security breaches and issues alerts

(d) AI Operations Entity (AOO):

(1) Oversees AI operations

(2) Addresses operational issues

(3) Optimizes AI system performance

(e) Automated Alert System:

(1) Triggers alerts to Congressional oversight committees and media outlets for any detected changes to AI systems

(2) Alerts include description of changes, potential impact on constitutional rights, and recommended actions

(f) Change Approval Process:

(1) All changes to AI systems require approval through discussion and voting by Congressional oversight committees

(2) Approval requires a simple majority vote, and a tie-breaking mechanism shall be established

(g) Transparency Reports:

(1) AI Oversight Entity (AOE) publishes quarterly transparency reports

(2) Reports include changes detected and approved, changes detected and rejected, and security breaches and incidents
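
Subsections (e) and (f) describe the alert-and-vote workflow but leave the mechanics open. The sketch below is one possible encoding of that flow, assuming (purely for illustration) that votes are recorded per committee member and that a designated chair casts the tie-breaking vote; the Act itself only requires that some tie-breaking mechanism exist.

```python
from dataclasses import dataclass, field

@dataclass
class ChangeRequest:
    """A proposed modification to an overseen AI system (Section 14(f))."""
    system: str                 # e.g. "ARE", "AOE", or "AOO"
    description: str
    impact_on_rights: str       # plain-language constitutional impact statement
    votes: dict = field(default_factory=dict)   # member name -> True (approve) / False (reject)

def alert_payload(change: ChangeRequest) -> dict:
    """Build the Section 14(e) alert sent to oversight committees and media outlets."""
    return {
        "system": change.system,
        "description": change.description,
        "potential_impact": change.impact_on_rights,
        "recommended_action": "review and vote before deployment",
    }

def approved(change: ChangeRequest, chair_vote: bool) -> bool:
    """Simple-majority approval with a tie-break (Section 14(f)(2)).

    The chair-breaks-ties rule is an assumption for this sketch only.
    """
    yes = sum(1 for v in change.votes.values() if v)
    no = len(change.votes) - yes
    if yes != no:
        return yes > no
    return chair_vote
```

A rejected change would then appear in the next quarterly transparency report under "changes detected and rejected" (subsection (g)).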

Section 15. AI Ethics Board

(a) An independent AI Ethics Board shall be established.

(b) Board shall:

(1) Develop AI ethics guidelines

(2) Review AI system compliance

(3) Investigate AI-related complaints

Section 16. AI Literacy Training

(a) AI literacy training shall be mandatory for government officials.

(b) Training shall cover:

(1) AI basics

(2) AI ethics

(3) AI-related laws and regulations

Section 17. Public AI Documentation

(a) AI systems shall maintain transparent documentation.

(b) Documentation shall include:

(1) AI system purpose

(2) AI algorithm explanations

(3) Data sources
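
Section 17 lists what the documentation must contain but not its form. One lightweight possibility, sketched here only as an illustration, is a model-card-style record whose fields map onto the three required items; the class and field names are assumptions.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class AISystemDocumentation:
    """Public documentation record covering the Section 17(b) items."""
    system_name: str
    purpose: str                 # (b)(1): what the system is for
    algorithm_explanation: str   # (b)(2): plain-language account of how it reaches decisions
    data_sources: List[str]      # (b)(3): where training and operational data come from

    def to_public_text(self) -> str:
        """Render the record as the human-readable text that would be published."""
        sources = "\n".join(f"  - {s}" for s in self.data_sources)
        return (
            f"System: {self.system_name}\n"
            f"Purpose: {self.purpose}\n"
            f"How it works: {self.algorithm_explanation}\n"
            f"Data sources:\n{sources}"
        )
```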

Section 18. Research and Development Funding

(a) Funds shall be allocated for research and development of AI technologies that promote:

(1) Transparency

(2) Accountability

(3) Journalistic integrity

(4) Constitutional protections

Section 19. Addressing 2013 Smith-Mundt Modernization Act Concerns

(a) Repeals provisions allowing domestic dissemination of propaganda.

(b) Strengthens protections against government-funded propaganda.

Section 20. Whistleblower Protections

(a) Whistleblowers reporting constitutional violations shall be protected.

(b) Identity of whistleblowers shall remain confidential.

Section 21. Transparency Reports

(a) Government agencies shall publish semiannual transparency reports.

(b) Reports shall include:

(1) Data collection and surveillance activities

(2) Constitutional violation investigations

(3) Penalties imposed

Section 22. Effective Date

This Act takes effect 180 days after enactment.

Section 23. Repeal of Obsolete Provisions

All laws inconsistent with this Act are repealed.

National Security AI Regulations

Section 24. Definitions

In this regulation:

“National security” means protection of the United States and its citizens from threats.

Section 25. AI Development and Deployment

(a) AI systems developed or deployed for national security purposes shall:

(1) Comply with constitutional rights and protections

(2) Undergo rigorous testing and validation

(3) Incorporate safeguards against bias and error

Section 26. National Security AI Review Board

(a) Establish an independent National Security AI Review Board.

(b) Board composition:

(1) 7 members appointed by the President

(2) 2 members appointed by the Senate

(3) 2 members appointed by the House

Section 27. Guidelines for AI Development and Deployment

I. AI System Design

  1. Define clear objectives

  2. Ensure transparency

  3. Incorporate safeguards

II. Testing and Validation

  1. Rigorous testing

  2. Validation by independent experts

III. Bias Mitigation

  1. Data quality

  2. Algorithmic fairness

IV. Transparency and Explainability

  1. Clear documentation

  2. Explainable decision-making

V. Data Protection

  1. Secure data storage

  2. Access controls

Section 28. National Security AI Review Board Charter

I. Purpose

  1. Ensure AI safety

  2. Prevent misuse

II. Composition

  1. 11 members

  2. Diverse expertise

III. Responsibilities

  1. Review AI development

  2. Investigate misuses

IV. Reporting

  1. Annual reports

  2. Congressional testimony

Section 29. Protection of Constitutional Rights on the Internet

(a) The government shall not infringe upon constitutional rights on the internet.

(b) Government actions on the internet shall be transparent and subject to oversight.

Section 30. Device and Internet Security

(a) The government shall not compromise device or internet security.

(b) Government agencies shall implement robust security measures to protect against cyber threats.

Section 31. Conflict of Interest Prohibition

(a) No individual or organization with ties to:

(1) Intelligence agencies

(2) Investment firms

(3) Media conglomerates

shall participate in AI development, oversight, or decision-making processes.

Section 32. Independence and Transparency Requirements

(a) AI Oversight Agency (AOA) and Independent Development Team (IDT) members shall:

(1) Disclose financial interests and affiliations

(2) Recuse themselves from decisions involving conflicts of interest

(3) Maintain transparency in decision-making processes

Section 33. Funding Restrictions

(a) No funds from intelligence agencies or investment firms shall be allocated for AI development or oversight.

Section 34. Whistleblower Protections

(a) Whistleblowers reporting conflicts of interest or constitutional violations shall be protected.

Section 35. Oversight Committee

(a) Establish a bipartisan Congressional oversight committee to monitor compliance.

Section 36. Penalties for Non-Compliance

(a) Individuals or organizations violating conflict of interest provisions shall face penalties.

Section 37. Diverse Perspectives

ARE, AOE, and AOO development teams shall include diverse perspectives.

Section 38. Fact Accuracy and Data Verification

ARE, AOE, and AOO systems shall prioritize fact accuracy and data verification.

Section 40. AI-Assisted Content Guidelines

Establish guidelines for ARE, AOE, and AOO-assisted content creation and curation.

National Security AI Review Board

Section 41. Authority and Procedures

The National Security AI Review Board shall investigate and enforce compliance with national security regulations related to ARE, AOE, and AOO.

Section 42. Classified Information Handling

Establish procedures for handling classified information related to ARE, AOE, and AOO.

Whistleblower Protections

Section 43. Reporting Channels

Establish clear channels for reporting concerns related to ARE, AOE, and AOO.

Section 44. Confidentiality and Protection

Ensure confidentiality and protection from retaliation.

Oversight Committee

Section 45. Composition and Selection

Define committee composition and selection process.

Section 46. Reporting Requirements

Establish regular reporting requirements.

Implementation Strategy

Section 47. Timeline and Milestones

Develop a detailed timeline with milestones and benchmarks.

Section 48. Key Performance Indicators

Identify key performance indicators (KPIs) for evaluation.

Section 49. International Collaboration

Collaborate with international organizations to establish global ARE, AOE, and AOO standards.

Section 50. Impact Assessment

Assess the impact of ARE, AOE, and AOO on employment and education.

Agency Implementation Strategy

Section 51. Introduction

This strategy outlines the implementation plan for the Smith-Mundt Transparency, Accountability, and Journalistic Integrity Act.

Section 52. Agency Roles and Responsibilities

  1. AI Oversight Agency (AOA)
  • Develop and enforce AI ethics guidelines

  • Conduct AI system audits and security evaluations

  • Investigate AI-related complaints and whistleblower reports

  2. Department of State (DOS)
  • Lead agency for public diplomacy and international broadcasting

  • Collaborate with AOA on AI-related initiatives

Section 53. Alternative AI Development Approach

To ensure transparency and trust, AI systems will be developed through:

  1. Independent Development Team (IDT)
  • Assemble experts from academia, industry, and civil society

  • Develop AI systems with open-source frameworks

  2. Public-Private Partnerships (PPPs)
  • Collaborate with trusted tech companies and research institutions

Section 54. Implementation Timeline

  • Month 1-3: Establish AOA and IDT/PPP frameworks

  • Month 4-6: Develop AI ethics guidelines and system audits

Section 55. Performance Metrics and Monitoring

  1. Track agency compliance with guidelines

  2. Monitor AI system security and integrity

Section 56. Stakeholder Engagement

  1. Congressional briefings and updates

  2. Public hearings and town halls

Section 57. Resource Allocation

  1. Personnel: Assign dedicated staff to AOA and IDT/PPP

  2. Funding: Allocate necessary resources for AI development and oversight

Section 58. Conclusion

Effective implementation of the Smith-Mundt Transparency, Accountability, and Journalistic Integrity Act requires a coordinated effort between agencies.

Appendix

AI Oversight Agency (AOA) proposal

Independent Development Team (IDT) structure

Public-Private Partnership (PPP) models

AI ethics guidelines and system audit protocols

I will edit this later. Upon rereading it, one could assume these three AIs are meant to monitor existing AIs. They are actually meant to scan apps for any backdoors that would allow the government to spy on citizens. There are three of them so that any alteration sends alerts through multiple channels for transparency, and it takes a vote to modify any relevant code or algorithms. They would also scan devices that connect to the IoT (Internet of Things) and the IoB (Internet of Bodies) for potential risks, such as where the data is being sent and how securely it is handled. This would prevent our data from being used against us.

IoT devices have the capacity to spy on us and to hand the government and the big tech industry the metrics they need to modulate our future behavior. If our behavior ends up linked to our families and our DNA, higher-level AIs could then map future behavior and what people with those genetic markers are predisposed to. This is a passive psychological weapon. Media, devices, the IoT, and the IoB are instruments of the fourth industrial revolution; they are highly sophisticated and pose a great danger. That is why I've tethered this to the revised Smith-Mundt bill. The media has been weaponized to the point that it is the opposite of what society depends on it for, and those expectations are exploited to propagandize because of the 2013 modernization of Smith-Mundt. There is no accountability, and the media falls far short of its viewers' expectations.
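
The scanning role described above (checking where an app's or device's data goes and how securely) is left abstract in the post. As a rough sketch of the kind of check the three entities could run, the code below compares observed outbound connections against a vendor-declared allow-list and flags unencrypted or undeclared destinations; the data model, allow-list, and flag criteria are all assumptions, not anything specified in the bill.

```python
from dataclasses import dataclass
from typing import List, Set

@dataclass
class OutboundConnection:
    """One observed network flow from an app or IoT/IoB device."""
    device: str
    destination_host: str
    port: int
    encrypted: bool   # e.g. TLS observed on the wire

def flag_risky_flows(flows: List[OutboundConnection], allowed_hosts: Set[str]) -> List[str]:
    """Return human-readable findings for flows that look like exfiltration risks.

    Criteria (assumed for this sketch): traffic to hosts outside the declared
    allow-list, or any unencrypted traffic.
    """
    findings = []
    for f in flows:
        if f.destination_host not in allowed_hosts:
            findings.append(f"{f.device}: sends data to undeclared host {f.destination_host}:{f.port}")
        if not f.encrypted:
            findings.append(f"{f.device}: unencrypted traffic to {f.destination_host}:{f.port}")
    return findings
```

Findings from a scan like this would feed the multi-channel alerts and the committee vote described earlier, rather than being acted on by any single entity alone.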