{"payload":{"feedbackUrl":"https://github.com/orgs/community/discussions/53140","repo":{"id":641994909,"defaultBranch":"main","name":"mmf","ownerLogin":"ebsmothers","currentUserCanPush":false,"isFork":true,"isEmpty":false,"createdAt":"2023-05-17T15:35:52.000Z","ownerAvatar":"https://avatars.githubusercontent.com/u/24319399?v=4","public":true,"private":false,"isOrgOwned":false},"refInfo":{"name":"","listCacheKey":"v0:1685635000.590127","currentOid":""},"activityList":{"items":[{"before":null,"after":"d420757d206911d9f2d2b0db8df8933d8791b365","ref":"refs/heads/export-D46283948","pushedAt":"2023-06-01T15:56:40.590Z","pushType":"branch_creation","commitsCount":0,"pusher":{"login":"ebsmothers","name":null,"path":"/ebsmothers","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/24319399?s=80&v=4"},"commit":{"message":"Patch OpenLLaMa tokenizer imports for transformers 4.29 compatibility\n\nSummary: OpenLLaMa tokenizer maps the import path to read from LLaMa's tokenizer; this is incompatible with MMF's transformers patch. Wrap module imports in a try/except block to fix these import errors\n\nReviewed By: ankitade\n\nDifferential Revision: D46283948\n\nfbshipit-source-id: 615baeedbdaa90d9aa6a67bd057272d0b67044f7","shortMessageHtmlLink":"Patch OpenLLaMa tokenizer imports for transformers 4.29 compatibility"}},{"before":null,"after":"316e250137ea9166f09c978528dcdffb012b8fe5","ref":"refs/heads/export-D45878336","pushedAt":"2023-05-17T15:36:52.269Z","pushType":"branch_creation","commitsCount":0,"pusher":{"login":"ebsmothers","name":null,"path":"/ebsmothers","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/24319399?s=80&v=4"},"commit":{"message":"Skip transformers.models.llama* in MMF hf transformers patch\n\nSummary:\nPatch prevents error `ImportError: tokenizers>=0.13.3 is required for a normal functioning of this module, but found tokenizers==0.12.1`\n\n```\n***/mmf/__init__.py in \n 4 from mmf.utils.patch import patch_transformers\n 5\n----> 6 patch_transformers()\n 7\n 8 from mmf import common, datasets, models, modules, utils\n***/mmf/utils/patch.py in patch_transformers(log_incompatible)\n 65 if not module or module == \".\" or module[0] == \".\":\n 66 continue\n---> 67 sys.modules[f\"transformers.{module}\"] = importlib.import_module(\n 68 f\"transformers.models.{key}.{module}\"\n 69 )\n***/runtime/lib/python3.8/importlib/__init__.py in import_module(name, package)\n 137 break\n 138 level += 1\n--> 139 return _bootstrap._gcd_import(name[level:], package, level)\n 140\n 141\n***/libfb/py/import_proxy.py in wrapper(module)\n 59 wraps(_exec_module)\n 60 def wrapper(module: ModuleType) -> None:\n---> 61 _exec_module(module)\n 62 _fire_callbacks(module)\n 63\n***/python/parsh/autoreload/measurements.py in patched_exec_module(self_, module)\n 51 start = timer()\n 52 try:\n---> 53 orig_exec_module(self_, module)\n 54 finally:\n 55 _IS_TOP_LEVEL_IMPORT = is_top_level_import\n***/transformers/models/llama/tokenization_llama_fast.py in \n 22\n 23\n---> 24 require_version(\"tokenizers>=0.13.3\")\n 25\n 26 if is_sentencepiece_available():\n***/transformers/utils/versions.py in require_version(requirement, hint)\n 122 if want_ver is not None:\n 123 for op, want_ver in wanted.items():\n--> 124 _compare_versions(op, got_ver, want_ver, requirement, pkg, hint)\n 125\n 126\n***/transformers/utils/versions.py in _compare_versions(op, got_ver, want_ver, requirement, pkg, hint)\n 48 )\n 49 if not ops[op](version.parse(got_ver), version.parse(want_ver)):\n---> 50 raise ImportError(\n 51 
f\"{requirement} is required for a normal functioning of this module, but found {pkg}=={got_ver}.{hint}\"\n 52 )\nImportError: tokenizers>=0.13.3 is required for a normal functioning of this module, but found tokenizers==0.12.1.\n```\n\nReviewed By: RylanC24, BruceChaun\n\nDifferential Revision: D45878336\n\nfbshipit-source-id: d80d22e07c733086d3c7b8ef86d134f5e7e41d57","shortMessageHtmlLink":"Skip transformers.models.llama* in MMF hf transformers patch"}}],"hasNextPage":false,"hasPreviousPage":false,"activityType":"all","actor":null,"timePeriod":"all","sort":"DESC","perPage":30,"cursor":"djE6ks8AAAADONTbcQA","startCursor":null,"endCursor":null}},"title":"Activity ยท ebsmothers/mmf"}