What is Artificial Intelligence (AI)?
- Artificial Intelligence (AI) refers to computer systems that mimic, and in some tasks exceed, human thinking by learning from experience. AI systems rely on algorithms and models that analyze large data sets to identify patterns for prediction or decision-making.
What is generative AI?
- Generative AI is a form of AI used to create new content, such as images, text, videos, or music, based on user inputs. It uses large data sets and algorithms to "learn" the patterns, behavior, and characteristics of its training data, and it uses that "learned" information to produce material that can be superficially convincing but is not always persuasive or accurate.
What is ChatGPT?
- ChatGPT is an AI chatbot that lets users ask questions in conversational, natural language. It can help compose emails, essays, and even programming code.
What are AI image generators?
- AI image generators create realistic images and art from text descriptions or prompts. Examples include Jasper Art, Midjourney, and DALL·E 2.
What are AI hallucinations?
- Hallucinations occur when an AI generates false information: output describing something that is not actually present or factual. Because AI systems are trained on large data sets to learn patterns, they may produce plausible-sounding text that is not based on facts.
What is AI bias?
- AI bias occurs when algorithms return systematically biased information based on incorrect assumptions. This can happen due to biases in the data used to train the systems or algorithms.
What information should I not feed a generative AI tool?
- Do not enter confidential, personal, or otherwise sensitive information (such as student records, passwords, or other identifying information) into generative AI tools; your inputs may be stored and used to train the models.
How can I subscribe to a generative AI tool?
- Contact DCB IT Department prior to purchasing any software applications that will be used on Dakota College at Bottineau computers.
Tips for the use of Generative AI Tools
The following guidelines were developed by UND and Minot State University and adapted to fit Dakota College at Bottineau with their permission.
- Responsibility - Content created by generative AI can include factual errors or inaccuracies, fabrications, bias, or other unreliable information. It is your responsibility to ensure the accuracy of the information reported in your work. Review all material produced for accuracy, violations of copyright protection, and plagiarism. Be sure to document and be transparent about the use of generative AI. Academic integrity policies must be reviewed and followed. Such policies can be found in the Student Handbook and the Employee Handbook.
- Security - Be vigilant about potential phishing attacks. Generative AI is changing rapidly, and so are phishing attempts; AI makes it possible to mount more sophisticated phishing and other attacks on your personal data and identifying information. Be sure to report any questionable emails to the DCB IT Department or to NDUS Phishing Reporting at email@example.com.
Faculty & Staff FAQs
What can I, as a faculty member, do to support academic integrity in relation to AI?
- Provide clear expectations regarding academic integrity and AI. Be sure to clarify for students your expectations regarding using any Generative AI tools or applications. State these expectations clearly on your course syllabus and in any assignment prompts. Be sure to explain the consequences for the students if your expectations regarding the use of Generative AI tools are not met.
- Discuss your expectations at the beginning of the course and frequently thereafter.
- Place clear statements on the course syllabus and on Blackboard.
- Be clear about whether using an automated tool such as ChatGPT is considered plagiarism.
- Note that websites that purport to detect the use of AI are flawed and have high occurrences of false positives and negatives. Any use of these should be transparent and students must consent to faculty submitting their work.
- Discuss with your students the challenges and opportunities that AI and automation present within your academic discipline and the subject of your courses. Acknowledge that other disciplines, courses, and faculty may have different expectations and understandings of appropriate use.
- Be transparent about your own uses of AI. This models good practice and contributes to a broader conversation about the potential uses, benefits, and challenges.
- Report academic integrity concerns by raising a flag in Starfish.
- See the Employee Handbook under “Faculty Responsibilities” for more information.
- We encourage policy discussions within your department and college about transparency and academic integrity around institutional, faculty, and student use of these tools. Greater understanding and clarity will benefit students as well as programs.
Can I use generative AI for course delivery and assessment?
- AI tools might be useful in teaching and assessment and could drive new course delivery methods. Dakota College at Bottineau will continue to have conversations and provide opportunities to learn more about AI and teaching. We encourage you to have conversations within your department about appropriate use in teaching and learning.
- Be wary of claims by third-party vendors and look for provable results before adopting such tools in your courses.
As a staff member, can I use AI for my job?
- It is important to have conversations on the use of AI with your office and/or supervisor to determine how and where it can be used ethically and effectively in your area.
- Be transparent about your use of AI. Cite and attribute work that is generated by AI.
- It is your responsibility to ensure the accuracy of what is reported in your work. Review all material produced for accuracy, violations of copyright protections, and plagiarism.
How is AI being used in my daily work?
- You are already encountering AI in your daily work. For example, when Microsoft Word suggests text to complete your sentence as you type, AI is being used. If you chat with a company online, a chatbot may handle the initial customer service.
- AI tools such as ChatGPT can be useful for putting together agendas, explaining a technical concept in writing, or handling other administrative tasks, but always verify the accuracy of the information you receive when using generative AI.
Can I use AI for my course assignments?
- Different fields, courses, and instructors will have different policies and guidelines for how AI can or cannot be used. It is important not to make assumptions about what is allowed and to ask for clarification when needed.
- When you submit work for credit, it is assumed to be your original work. The use of other resources, including generative AI models such as ChatGPT, Bard, DALL·E 2, or Midjourney, must be documented.
- Generative AI and large language models are not designed to establish proof or provide accurate facts. It is your responsibility to ensure the accuracy of what you submit.
- When in doubt about what’s allowed in a given course, clarify it with your professor/instructor.
- Know DCB’s policy on Academic Honesty/Dishonesty which can be found in the Student Handbook.
A special thank-you is extended to the University of North Dakota and Minot State University, who provided much of this content along with permission to share and adapt it for the Dakota College at Bottineau campus community.