Bias. You've probably heard of the entertainment pages' "demon-revealing mirror" headlines, but have you heard of AI as a demon-revealing mirror?

Large language models are built from huge amounts of data. If that data itself carries bias, such as cultural or racial stereotypes or gender discrimination, the information the AI generates will reflect those biases back like a mirror.

In the early days of image-generation technology, people noticed that prompting for "a successful person" usually produced a white man, while prompting for "a construction worker" produced a Black man in work clothes.

Now imagine applying this AI mirror in social services, hiring, or management. What consequences might follow?

AI bias is unlikely to disappear any time soon, so whenever you use generative AI, scrutinise the narratives it produces carefully, to avoid amplifying social inequality and discrimination.
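The "mirror" effect described above can be made concrete with a tiny audit script. This is a minimal sketch: the sample captions below are made-up illustrations of the kind of skewed outputs an image generator might return for the prompt "a successful person", not real model output, and the keyword list is a deliberately crude stand-in for a proper demographic classifier.

```python
from collections import Counter

# Hypothetical captions for images returned for the prompt
# "a successful person" (illustrative data, not real model output).
captions = [
    "a white man in a suit",
    "a white man shaking hands",
    "a woman presenting a chart",
    "a white man in an office",
]

def demographic_counts(captions, keywords=("man", "woman", "white", "black")):
    """Count simple demographic keywords across generated captions.

    A real audit would use a proper classifier; exact word matching
    keeps this sketch self-contained.
    """
    counts = Counter()
    for caption in captions:
        words = caption.lower().split()
        for kw in keywords:
            counts[kw] += words.count(kw)
    return counts

print(demographic_counts(captions))
# Skewed counts like these are the "mirror" showing bias in the training data.
```

Even this crude tally makes the skew visible: one prompt, four outputs, and three of them default to the same demographic. The same counting idea scales up to auditing hiring or social-service tools before deployment.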