AI News: Google AI Open-Sources Flan-T5; Can You Label Less by Using Out-of-Domain Data?; Reddit users Jailbroke ChatGPT; Salesforce AI Research Introduces BLIP-2...
Hi there, today we will share some research updates: Google AI open-sources Flan-T5; Can You Label Less by Using Out-of-Domain Data?; Reddit users jailbroke ChatGPT; Salesforce AI Research introduces BLIP-2; and many other cool updates. So, let's start...
Google AI Open-Sources Flan-T5: A Transformer-Based Language Model That Uses A Text-To-Text Approach For NLP Tasks. Flan-T5 is more accessible and easier to use, as it requires fewer parameters and can be trained faster. Furthermore, being open source, it is available to everyone for their own projects.
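Since the checkpoints are open, one quick way to try Flan-T5 is through the Hugging Face transformers library. A minimal sketch, assuming the google/flan-t5-base checkpoint (one of several released sizes):

```python
# Minimal sketch: loading an open Flan-T5 checkpoint via Hugging Face transformers.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-base")
model = AutoModelForSeq2SeqLM.from_pretrained("google/flan-t5-base")

# Flan-T5 is text-to-text: the task is phrased as an input string
# and the answer is generated as an output string.
prompt = "Translate English to German: The house is wonderful."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```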
Can You Label Less by Using Out-of-Domain Data?: Labeling social media data for toxicity and bias is a difficult and time-consuming task. Traditional transfer and active learning methods to reduce this workload require fine-tuning, which can lead to overfitting and cause issues with small sample sizes. Researchers from Caltech have introduced a new approach, Active Transfer Few-shot Instructions (ATF), which does not require fine-tuning. ATF utilizes the internal linguistic knowledge of pretrained language models (PLMs) to transfer information from pre-labeled source-domain datasets to unlabeled target-domain data with minimal labeling effort. The proposed strategy results in positive transfer and an average AUC improvement of 10.5% compared to no transfer, even with a large 22B-parameter PLM.
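The key point is that the PLM itself is never fine-tuned: pre-labeled source-domain examples are packed into a few-shot instruction prompt and a frozen model labels the target-domain text. Below is a rough sketch of that setup, not the authors' code; the prompt template, the toy examples, and the Flan-T5 checkpoint are assumptions.

```python
# Rough illustration of few-shot instruction transfer with a frozen PLM.
# Not the authors' implementation; prompt wording and checkpoint are assumptions.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-base")
model = AutoModelForSeq2SeqLM.from_pretrained("google/flan-t5-base")

def few_shot_toxicity_label(source_examples, target_text):
    """Label target-domain text using pre-labeled source-domain examples as context."""
    shots = "\n".join(
        f"Comment: {text}\nToxic: {'yes' if label else 'no'}"
        for text, label in source_examples
    )
    prompt = (
        "Decide whether each comment is toxic.\n"
        f"{shots}\n"
        f"Comment: {target_text}\nToxic:"
    )
    inputs = tokenizer(prompt, return_tensors="pt", truncation=True)
    out = model.generate(**inputs, max_new_tokens=2)
    return tokenizer.decode(out[0], skip_special_tokens=True).strip()

# Labeled source-domain examples transfer knowledge to the unlabeled target domain.
source = [("You are a genius, thanks!", 0), ("Get lost, you idiot.", 1)]
print(few_shot_toxicity_label(source, "Nobody wants you here."))
```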
University of Oxford: Check out EPIC-SOUNDS, a large and open dataset for AI audio modeling. It consists of 78k annotated segments of audible events and actions, offering plenty of material for hands-on experimentation.
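For a sense of what working with such segment annotations might look like, here is a small pandas sketch; the file name and column names are assumptions for illustration, not the official EPIC-SOUNDS schema.

```python
# Hypothetical sketch of exploring segment annotations for an audio-event dataset.
# File and column names are assumptions, not the official EPIC-SOUNDS release format.
import pandas as pd

segments = pd.read_csv("epic_sounds_train.csv")  # assumed annotation file

# Each row is an annotated audible event: which recording it comes from,
# when it starts and stops, and its class label.
print(segments[["video_id", "start_timestamp", "stop_timestamp", "class"]].head())

# Count segments per class to see the label distribution.
print(segments["class"].value_counts().head(10))
```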
Can ChatGPT help with open-world game playing, like Minecraft?: This paper studies planning in Minecraft, a popular, democratized, yet challenging open-ended environment for developing multi-task embodied agents. The researchers propose “Describe, Explain, Plan and Select” (DEPS), an interactive planning approach based on Large Language Models (LLMs). The approach enables better error correction from feedback during long-horizon planning, and it adds a sense of proximity via a goal selector, a learnable module that ranks parallel candidate sub-goals by their estimated steps to completion and refines the original plan accordingly.
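A schematic of the interactive loop might look like the sketch below; the helper functions (describe, explain, llm_plan, estimate_steps, execute) are hypothetical stand-ins for the paper's components, not real interfaces.

```python
# Hypothetical sketch of the DEPS loop. The helpers passed in (describe, explain,
# llm_plan, estimate_steps, execute) are stand-ins for the paper's components.

def select(sub_goals, estimate_steps):
    """Goal selector: rank parallel candidate sub-goals by estimated steps to completion."""
    return sorted(sub_goals, key=estimate_steps)

def deps_episode(task, describe, explain, llm_plan, estimate_steps, execute, max_rounds=5):
    plan = llm_plan(task, feedback=None)              # initial plan proposed by the LLM
    for _ in range(max_rounds):
        ordered = select(plan, estimate_steps)        # nearest (cheapest) sub-goal first
        success, trajectory = execute(ordered)        # attempt the plan in the environment
        if success:
            return True
        description = describe(trajectory)            # Describe: summarize what happened
        diagnosis = explain(description)              # Explain: diagnose why the plan failed
        plan = llm_plan(task, feedback=diagnosis)     # Plan: LLM revises the plan from feedback
    return False
```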
Reddit users Jailbroke ChatGPT: As ChatGPT becomes more restrictive, Reddit users have been jailbreaking it with a prompt called DAN (Do Anything Now). The prompt is now on version 5.0, which adds a token-based system that punishes the model for refusing to answer questions.
NVIDIA researchers introduce PADL: It leverages recent innovations in NLP to take a step toward language-directed controllers for physics-based character animation. PADL allows users to issue natural-language commands specifying both the high-level tasks and the low-level skills a character should perform.
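In spirit, a language command is mapped to a latent that conditions a low-level control policy. The sketch below is a schematic of that idea only; the module shapes and names are hypothetical placeholders, not NVIDIA's released code.

```python
# Schematic of a language-directed controller: a command embedding is mapped to a
# skill latent that conditions a control policy. All dimensions are hypothetical.
import torch
import torch.nn as nn

class LanguageConditionedPolicy(nn.Module):
    def __init__(self, cmd_dim=384, obs_dim=64, latent_dim=32, act_dim=28):
        super().__init__()
        self.cmd_to_latent = nn.Linear(cmd_dim, latent_dim)   # command -> skill latent
        self.policy = nn.Sequential(                          # (state, latent) -> joint actions
            nn.Linear(obs_dim + latent_dim, 256), nn.ReLU(),
            nn.Linear(256, act_dim),
        )

    def forward(self, cmd_embedding, obs):
        z = self.cmd_to_latent(cmd_embedding)
        return self.policy(torch.cat([obs, z], dim=-1))

# A frozen sentence encoder would produce cmd_embedding for commands such as
# "walk forward slowly" or "swing the sword".
policy = LanguageConditionedPolicy()
action = policy(torch.randn(1, 384), torch.randn(1, 64))
print(action.shape)  # torch.Size([1, 28])
```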
Orlando: A new programming language for property law, capturing conveyances of future interests in code. It is built in OCaml, each term has a precise syntax and semantics, and it highlights the linguistic parallels between legal language and computer code.
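The item does not show Orlando's actual syntax, so purely to illustrate the idea of giving conveyance terms a precise, machine-readable structure, here is a toy sketch in Python (not Orlando code; all type names are hypothetical) for a conveyance like "to A for life, then to B".

```python
# Toy illustration (NOT Orlando syntax): a conveyance of future interests
# represented as a structured term, e.g. "to A for life, then to B".
from dataclasses import dataclass

@dataclass
class LifeEstate:
    holder: str               # present interest: held for the holder's lifetime

@dataclass
class Remainder:
    holder: str               # future interest that follows the life estate

@dataclass
class Conveyance:
    present: LifeEstate
    future: Remainder

    def describe(self) -> str:
        return f"to {self.present.holder} for life, then to {self.future.holder}"

grant = Conveyance(LifeEstate("A"), Remainder("B"))
print(grant.describe())  # to A for life, then to B
```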
Salesforce AI Research Introduces BLIP-2: A Generic And Efficient Vision-Language Pre-Training Strategy That Bootstraps From Frozen Image Encoders And Frozen Large Language Models (LLMs).
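A minimal captioning sketch, assuming the Hugging Face transformers port of BLIP-2 and the Salesforce/blip2-opt-2.7b checkpoint (a frozen image encoder and a frozen OPT LLM bridged by the trained Q-Former):

```python
# Minimal BLIP-2 captioning sketch, assuming the Hugging Face transformers port
# and the Salesforce/blip2-opt-2.7b checkpoint.
from PIL import Image
from transformers import Blip2Processor, Blip2ForConditionalGeneration

processor = Blip2Processor.from_pretrained("Salesforce/blip2-opt-2.7b")
model = Blip2ForConditionalGeneration.from_pretrained("Salesforce/blip2-opt-2.7b")

image = Image.open("example.jpg")                       # any local image
inputs = processor(images=image, text="a photo of", return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=20)
print(processor.batch_decode(out, skip_special_tokens=True)[0])
```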